[Binary artifact, not text: a POSIX ustar tar archive of var/home/core/zuul-output/ containing logs/kubelet.log.gz, a gzip-compressed kubelet log. The compressed payload is binary and cannot be recovered as readable text.]
ĉH m~?ET_oKD]ß`R-~c B1S,Q?%xZaZ׳ڕ,zا!~N7`bJƳX@ /Y4@)LCQsX#wD^kTk7iiU|רcV^;7 u?',]ł-U*IE}٨o!-L5.8f&`:*{#3+fARD~ 7mlU{C[oiga]\+#)I4ș[5Yi]}&>{u?7(EIX'>J@@=>R 'a|IH  vC9K?sv!h8QƆx ؤ;]7wqɅ2]Z5)vI0]5Q;ֻEMY]־>VtH]-vڷ»r8jkU˒0qS者|e:"4ciˣ7!kh1Pr,dq kn0=ʭ۝ogiV C0m8rQ[m'=qN~qTeݹwG9JX|h^u{#˞mF6G5i͌kI[f@Z.̹8u3NpIb9"Bqk D'Qo6JZ4Պ$Ś \MZ:+*azk\\4ܫ7Wl߾۷31wx2/ qBXl1IF}H 9rFj!$+!X 1e_fѯ"-exPQC`R"(wRƦe8(5Ɨ`s. †!P;`yHc6 ȀR_*PΤX깍)KQzIMG~~7kBu0gtWw[oᙺ18^RXhԞ2 04zn aP=L2dUJzD*t.]8@s70 @XwdwDqWwudn[̲h[2o IعqMq۷@][cR;ĬWY@J\TknckQ׾>\fAuv)'T6daR>r~i=yERBW->}[T ]^M}57/5$Ѿʝh90x:Gsv6ADIi .N)|@M0 Xn@lޖc葎ո,YT9X♅uaA/z_N~IG Xu*{I_rrڌsSXo9OMlf@x;ɭŕG• + W.Lo`eUls`hn)1HWܷXrav@cP h6[R[ lGӦ:@;E4ua"{&CD:@cLSS"!&"s$# #g txxJA3& >p!O}m ?J0S|ܘ 0"`NZ0< =n@fQ`&P$6#E,@FFo$𨴺7poٲo*2pP삕߶͑\Rtͮ)R5naܠ?>C}fO{~پ}5=_຾>΃"EQUPRÌyqcHkmG] 8 uj -zh9g3 G=ȣTGTS&>7$f+rӇq 0EN93Xp2\IMPJqTQܦ+EZ)"VD h$-Jz$JZf㥪Wq8y !t~;hܹ>HQ+խ%QgS:ΦnuVS[[Vk=q99w8r5{`uqtd\(wp)@% O=cF@1gz4‚:FZ̈́huȥg\JTJ֪OsKW~5s6Zɰݩ*!LoͻP擄yHT{J2X 0Lqɱ72Sc4ժm FҦ`)m FѦ`)/`BgbZ(ަ`MhS06MhS04n'1 eY.C/CgMl9}F堽~nw#LEb0bפuvb_{SW sU PUv%qK]%ݘGڡ?oaMϯaO0rrgĥ+f;fz=)}#49VLc*M@fsJCuWY˖2Z* v7[s|$;ϝ l6-vH]nk4 ّԹ$#Y{VM1rA%+AiimRSCH҆9)AS6W ҠvIQ;$ʈEX6iM|lFqғez8Aũ _4\9gG:+rXd=GJٺ88舀&F¤\6R~ԩB)Fh(8 =!"=AZ"X{) n@Ŝ(aH1r#Љ)ΘpP400h(QA"V o#[S]zAS4.V-[vZe\]rV־9}ikΧt"J7UZlJK'j_Z:QIy[Z&D "PH /I &uD9hkc0dJ,2쩳`m]aPgV(y~m K`0r"b&j ?u0Qpk@Ð>#usu>~, WJ}*Qt^^OUW3r'rٸ݁ZԛWWJҞՕ 2qF*٨D碮Z䩫DUW/H]i ɷ^*+RN{VÿfJgy߁x%uݎ 4xtA揀>li*oSRu$upSq9X37E x -7;\{ޒ3|y)@:ɦs4Nj`aMx."u6N[=Lbx~Ddu1"QR )Oo>'+v>#ٹ`DSۛ93/c(J>'s"XJ>é!uPتtsRW`)F]$Yg?}jxuQWjǩWD)VW+q^cG%9R:t1?/h3_wjfOwkܹxC @y sZD)3ln%9 ,0U5W~0Wez뺕_.CpHC_zr%;7E{dY#m1lhErf-{kͪ#+fpAB1.9ͩ26GDۍc:W j@=t(y:KD,:KD,:Kr$D,:%BgY"t%BgY"t%BgY"t%BgY"t%BgЕ,{s+0K'/D,:KDTGHa,:,:KD,:KD,:K~&S"47ެ7k͚{ެ7J˚sT7^|E]=E!VU0Tֳ?eՋUv'FtԹ;٘ZHSR=ѐ..>.j[᩷Óm?9I*.VMTLs/KPB( 4٪Wڻz}W,tH{+bI76|w|wF8xsO|q#=?/')}Iz.G3\[ye$9g2kŐӓq{K^̽'퍃$5b.KnP-LU9<6`&maE<5*1$}0ʜ"5jkÜl]  yq1uu|q_m97c,oY\jȂP/z2tL݆)+8n@_k L_3eaG/JW6CWB@á@gQHW"Ꞵc CW׭@ (zt1mJ&ÛnZCRiKT>O.<ʒގ~m}X|{Ә`ػ7ey9Iɍ*5F!曣߽w{r2ܿ9ۿӝ27Gv}ș| '&ŷ<ֵm'B`]>FџwC.z)-y-5 
;V =# +OFѧw?މni3.oF >(<p@x6r2:t\ n+tPF^ ]9%u/CIwEWmC+tV']m>2tu/{g~p~h6r#=*NzГǸKW8\+n+tBrt2ҕ 8]pxckՀV hCztZ+v~>M6C+IW/87DW  ] uCnBt$t%Ql ~^ph+t5в9t(9IW/Mׄs݄T6v]d<_l] 8fr-Z㷒kZ=g}VUz+ .͂0UdnƸh6Da;c{Nm<14-KQoIUFÆsh9,9ІW@I&NZʚ 5Q~~{{~V'Z}Ge<Òܝ.t'"!`ϼp ] R']@$ h3t5z ] AIW/E!!mܰu޸C2i芭~Kfu3t5>Vƣj2PtJFޒvCYjhji_"]VxCt%38:h{Ӥ'aø_cܵTM 輱f#07o#z &56\?aБqWDoxt}$-ywu]19-],~ޥ ~A1zsߝ78CӾ kՈ H(󫮓5]'|>5p`̛G4,ߟ!X.T,su}Ӯ.q 7~|oY]*]t{OFhpki`>WK>w{儻2j~L9"Gj|KBVzǵxw #PQż|Bys6{?[aaQ7>o?5_ge0?} >CACs[!:?A:o~̯ |c_ؒCIY &j3;ex>$c(|_~} gr}C{~8>Gru~?;4uz;~[/r'*DFngnVcg9,8&0UA6\PFSf ͙RqΤj V4\mv +qWS}*; KMb)d RΗZY[7a"FUl&fn͉!v-@\D6*FZ0Y4Fzs.E^hѵ\N߼j)z)5[絋!2viL24(85cz-pO!nP`2cИ]f(:&.\L ZՏᖈ&3GVLU.fk J{Іtu:bVDRC1wi-ԇK@cVKĮգ^[Z# AH4HoOTolTqhi/d3<%cNCojšƬ*TR*B9ӐJJ6` )'kG@1R ka$oq 80Ǒ֑-~B_[FXqYk|.`5H5ձB_2"$s|mV%wsUMEOɭ+6US>dkhO=u52)90f֬Qd|`[,)";D{j ޵6d#oG0Lc&)eOh@ZE6`5fg0ؽZؠA[T< wԡkˊ#hJV**>:@2hCgG3cMm̭>VXLĽd] B2fCvXWe+3\7ˮp\ɘEFIiݘȆ89TmAPh5 I[E\N=Rc5vx X.J JM  c;k&4$\ @pPk@ou4-**ӛP]Kf"Xr2ukB AvE'=3UFnTz ! 2nPư)0u1JҚ`ePDdBEKViLsR)C(f֜ Ab΂G ݄cBl  ` 7RoA e3kK@4`CA:2< pì Aཹb %RI9n,\ BAy՞e (EzeH_u4 8 C]Tj"usj{UQR@}j4532-hƧm ɫ#*q `d>"2Db hCQ)VczhMA |f*0(gcrU/fĥ&A ǘ@QcSSt"'D̿( {0s7`Wvv˚x,ŷ*ׂYz.0A@6=Dfx:p2ePtd)Y!J@hW!ˈc*,OHv9A|`a3{=h5'HiD.0iym,}P![n 32]\ˊ1ZzOy L5)/(9n][ q;2pԆn W*YȩՏoM?/Xż*xa&k) _ #H b"җa|}Sߟͻ&hkڨ9t 'puέ@"T2"/U+WS.ǻ Xm:iϏO߶+Ù$yUPEwH7b]6sɸҢ"l nZ'dJY:Z- ݚ{)D$DHY=)4z&x31>Iˣ!MʌؓXÁwäDlwH'5\ ty݈\`խ:)Jj 2T2,0eR-1#K HςO(XRVz `c_P!>G.;"X$O\k'OnE#ǘAW] Zj &h=)[`~*< b7 1RTg]t>pUM," 0e| f,Gptw=)@`H0Ky@uȫ+4;`UFo`L>l 2i*47Lg Е3f,Sξszm-*~] ^E*K"(lӲǻ @.CśRLac ',I;~?gtJ|8PLt4pe/7r\2;3/yXOdTS'@q.?sZr:4>qe%'KMfùn34)>~u0)u)WA MnA{BYKL =!&P^(/>(Ki @D޾+2 L d!@B&2 L d!@B&2 L d!@B&2 L d!@B&2 L d!@B&2 L d!@.@7]b;'.0@ZvGzLG 2 L d!@B&2 L d!@B&2 L d!@B&2 L d!@B&2 L d!@B&2 L d!@ϖ d.1@`!lg@Y\ZqL RRLyd!@B&2 L d!@B&2 L d!@B&2 L d!@B&2 L d!@B&2 L d!@B&2/ iV&KEg@ -9a@ϑ $f@B&2 L d!@B&2 L d!@B&2 L d!@B&2 L d!@B&2 L d!@B&2 LZzNo94׺IOYBPRyE^& HqIMewK !.whIuA'ԀB)aZ}z%uch#` q] ;p T=Ga?pz21WY\ә3H+;qcCSq_m"4%Bh7KO|&,wMw/5=|ne/.L҉`UɌpėyV 
*/+|7%)?LZ?t/&ޫcc^[Gf~kY) 7>rZ/L][mΎ' ߡ!5m:孧?샷5w~i\9}qoThӜq6 4pǙG<4:'Z-ܜ6(~g-iv2mMG|`z!U+:a'P٥zȚ;d۬>[/ڼz9^^^eԿMwvn 8_^oU"_Moҍ/kѤ5p}>Dq|WZAZ#q־+KoCkd^r f Å0ΏZ+)8|n>?}3{wuY $x჏o7v-;QDv >0LsR9uJ#UҖg:Up2QQ"[zmYE)%DCACd#HiJ3o} zrLR3u8)QNִ̋P>5Wq#Eo(>@-)unn\k"jd܏C~:Ob 1sR<*շm ni߼| ? \Y? 6!ؐ_ÆˇOFQnk1rkXS׃s;rHq<(G7+ ؚAͥ*Y53x@Q DRq[2+b"ܐx*yٓѪ)#Ҫ@"rU2KQ8Vn#6ckq2b )'PrIG" TFGBRV`2Ӏ;Q>piԚ(c$rqs.p*LAhI;Lz~0q}}`"qJZJݐ]4 k|*?ʆۋΠ?<;- 0."z)\&ceZꓐvKM j.E/E^1WYAp˘Ho:\(C{]1tۊVi-y6L7D '*c hJ5` $eZۗBmcPgi0642zH{E}ŝV Ӗi"IEg q%WɣE/TSӝG'H:+h0Sc2 I֘\J)oa.!NÁہ)$}\eR FFKjQYBx[pVoP\zYԩ_ul]QVw7W5ZzYh2nDOMP|>"M'8(@uc b!}i62\V˼&Z>-'&r׫8.[l*~4l iIr]5%JhM׏<Q4)R w]5;z>䇲G.?orr>"$dݗ9W{$뼏8|# ;3nu/G攋Է_Hrj..pA% mzqˆ_~oL,Jinߋ ]Gd e.7O4j''7̤_gźS3m-yjhqH 4ܛr_Uʡ'-mCg]G^"/aN/a07+k(̦p̏Y &mlO7B}{sw58/*r`5xeQFkT2Wy]ѭ }ë;*?g-#IB;/ l . pAWYk9o̐")R3|G,KtwUu?U]]$ۄdwaJK&c-AX, w ջtwDT=܆6G-iV*(u{?`{]W't6P}gg0BΩ1,?┈-rLM[Ru*;E7Hl!F\>s/ ߵ<f%9t!W8fzUeV&57翆ɇQ/u :3ng[[jϿ"#uB4uX $;vHzYIAHԌm|Մ+-NhkNfMݕTYMBKE\d5q.+hYS_.ed~dz^ pn15#LnU2Ja%k;{=8ܽ !ɚ]m-r9_ֵyvU]~Imxx>|IOţRron݌zq< m'3vWUQ̙N^+NW~KW_;FXFi/bY[jݥ^fע*ZJ} c1/dϚ_Dv gxCxaB/gPKL/#y-eŜh#D,3vՇhsk̭xI]|plWQ?nL?ҿ_4*ZT鼧c 89__05uh.:Han5lp0EF3fXZeyjߔjob&DRl`ht WQCS\Lj$6QhWmGkl=Wc*5C8,=Sh?o a?}05QkơaKm ;ckNnEuZYaKAp.h벓`Eق`ӚÕGo n Lvu,бO'EzW%Vلovo ϖ *% %pjW*ibk$J/2*#FSI$g8_\iIoF,?\9F$㑅H>HԔ1тXMNU 1&TX ;z!ώ%[5; ::y3sĽO&oƷ wh5Z`(3*h2:ڀmT 5`!ӄ\cN\SJB\WyoUoZ gl$&2Z)SuI˿ȸE!F2ί' Pix`,f(1xf23Ńt>I;};Ƚ;x'kGS$?"%%48.9U$ L;Fy*ke(H,dPi#v>6t{Kޱ-jb'} iaf5>N,H;Bpyya#㸊!8 #|ZkMH,.M&#VM)w6Zk˙ 4`m`SNrH`I9XO7x5PћܔU?'a\ƪmuFjao` , BEqrrP41No}mM T!5kH®{Blp*ŒFY%2Ie$- ܭ/* syp OA𦗇~A v=>iY#h2,Ogeoզ ͧB("5K`0@u1׼x뛞ާPANnG 9wDߗro94Ж*y5^'IC,L-" Y< MQ0 G!qջ]ָO顴#tt8"OB8bh՛)bI<sJ>D0:s )硯oJ`jtb$ \PF#UX]D덻v2 &%^~<46rD1i?FX5:}šI%IYyi+:\ʮs҇aWgfNhG |̙ Y)ל٬^־*oyL c17]{(OcWEf l[yDJQ&߯M131~XJa ^8 le{ <i,1#PJo73i'ڷS2Ya_1]ҿ1cp?9R=e~w% #y.no#SjŮ*ז\Xbv6ai))$ Ti\ە_?&>G BBe Af)1$@Kd=l >@G E%VcmLNAm0S"0E4I8{&CD{@cLQ)b$kࢱ<`M؂x1_$Yښbo-Ώ5}iɧbqiRB^xdV)ifQ0`&P,6#El錷1.MzZy&-*ڲbs F, 
调v8(IMS"*БDMbg9?L&0}BGJhD;XxȤ.}ap.*|+QߋJL||&0d"Q>]V¸lkuxO?i=5 VR!O*bl7Q4*OFO| "Oe}EZNۭVW4?p[om[ "ʤf+3.![n|I8SU5 v{\1dҿFa)ּuJat}UySY# @^ll7 eJ0"G:JVk^aĴfoUܷmSYʸ"(4l%F_$ߛqֿ8ѵz˩،/W4y?x !jZl;^)@J߄leAkGk@Ԓ3{^iܤ~O`u؊WNX&,Ǻ>k0ih;NƖ%]myԂkJB{ ψ̲'F_(߯'b-e) 1zhI"Mc0[Ev 487YC0+66g1Tzc OZ?2.f)a '9;TY]@vB;rDhD 04QhZ(ڌ-2m|#嫢YT٧N?quܒH]5jQM.~|bTmw@׿B5VéZഃS-0V<5bv(yHNe2(:o;cƌL"^ˈiDk452Dl 0ו2T㇢KSd(<0Jk48a]Rrm& Ym4|Ѥh-OiڝصW76Zsy '&g/+ont3&7vl<]]-=Y#M@ĠV%Mrg^>ܨ7]}nkʣ,ys2CQ2,79C^k ˇ~#!箘vxwxwxwxwxwxwxwx;xn |SJqN" 8$ӄHy3&9;Egod篭;Ac}ƶDv@)3K GUT QhZS6hB!:V>ߴc<^[LN 8xrı6O#1Q3]ǚoBk".}*1RhX먿Ha X0),H3W-N G xN# ? \ľl9X\逝pPLE(h;p)mT'R slc2\`D8+6g",]\K-iqp,y gM٥ЏY~s=HQ^0ԬB0ӏ͎FajY |[B"R  TWs͋陁} гpbBTk1}ιKCSlhC[hX _g70{h eZw7}_$AmB]Io%]Vw;Z> _> R- < ሡUo].K$s@J=vV)ևF"r>BRPwn=#SŤ4UfϓbxҦ_&$My; _ƛo$t4$UvӖ> 7+ ;ӳ'пgdIKW(̺p_ROEa~kMn dQ0CyzSaU<"4|.]d *0w\JNjG#oK)l \ CpO!H@2 LZILFVctLp@V;~y ৗ"^>Z*n=d|$;ϝ Mm$viL'TۑPvT4$ftpJg+זj-0av6ai))$C=UŮd^7$p:[Bgſ,Ȼ_7e_"LEË:Lʦ_?Pٻ6d %G`{HB2cԒm%gx )#qF$KUWW7\&|hEĝ7JU9&CadRB . gb?첌W8{5: ^>\H*RSHBC.oo~TGp%dy5poh,B6L.ׅ/wyg&jv7+;`\ K )+JRr~Ey146hhog1bZ%r0yQZ!r0*gerKFDopwwjj4pv"oIVs3[ 5jmeIQ^ro5È`S|FpTd! #c4oa"{&CD2KS"!&"`$# ƆՁL)zDE3A7>G jkF? (1+Aǂ<|cwTn7V`V{UX]<4 7"u)bAya)<&ޥOv~M/X쫏pw 3;..YKط;@i.m͖}?)l`|QD!'_׏ 3΀;\ά5 ֛)W7;ʁ;Q 5D¥P ^ 8*I)HHʐ dg 9Hu;T%w]r.9xgaR;ZugTDb,YB_/6LM[Wk"v]l>ʹfuW-]_:V}΄EHqZU zϞ@̧\6`#ʆ)j7 Eoi)D"^*K9@Ίgt#gyVV+?_:'Ahҥ*Ne0!rpHPddވ!`+,`F|0a%- T)JkboTc'|O\'l1ߑxM(S h㕏S\-&x#32xM\qWMZʳ[V8Z{kǛ_>~O:Phd0d>OP fh/#XHHLBDN&d @e_E.qVqLe r]axd#m4a\,u>azbV峳׃b oO#SjarD>[oii_H1?D79F_ꄨ&d^oh|uR\E,YOArk`Mu}HjīFJ3]ԉ%Bp>v誽cZkzX"A.NIuu[ nNf z?+wq2״%үnx֦z-EuFOesj%q!-yc%fպS&nɶ6~2䖋Ts^M" 5J x?l|X|:DZ&E86Nj/TR>˿5GxSl[/boI'/q*BLѶXңԮ2k(ɏU[ O 7,V?+I䩻R־h}|Mw/ji }PQ[ |4y ܄GUNqsmRݧr2-GoaB&:2+v,EBjT!"]dq !4mAtMMčK.YCH8QkՠNRrBXj0JCX?6PKn"'* ["oCf>e^V+< ͗nv@JV7ˋ m1yl`ht WQC@Lj$VZJ4̰5Jql*"VFudXDcZ)"FcH6=pv']BkVL;v%7hQù\yb>)GOVdwRG6cwg,ܮW]^UZ6H}Dhiӛ]bH6%Eo':A}8E2S' `3ڳ` zb`kMXW@>;E*J?3 W@5+2&@sGEύ_>G].k̍ &-[]N}6`d LSvS0A ^&XI#k\# aR/1",DDꥦĀt4<)c"M`RF~g>6Qg1! 
MJ=uh5Z`(3*h2:ڀmT 5ѡFx1'Чk.86`Pmu "!RnBΕe.y]$6ɨxX̵Q*g er+gI10I4V  jO)C`< 0))Yq1HI`o`ʔh4hcR;ĬW逰 ߜ @;' pV;+y}h?E E. ,~‚nnl|hlw3g^dh ZbdQY%0,wKÙۉ(Ҷ:{55Bᔄ I@` qe{l$,f~Z72PlJw]foKۯ GV6~9ݚu0ݬw&wA7Y dʭ,mg"p^?<՘FގA.f~vS%m F =s3ܧ| _*c̱|27YGH(򹖜N)hOfxtʹGok) SNUCNa5 F7Zڃ%˧0Ż7?Α|pu6&!}/ sɚ'y- D. u˹y?4~T\aW 47_ i/'j%Uv@P ǩ&~2(*,llۙAbkP`̀%B+TfafN3[K8\qXt X<y,̡tfj0 9+an\Xk6I` ZGK!`2f4j|sWL#,8ZI}gin\q/Fܼ}Y'-nGr 7a~tXl)٪vi 2NSuOڶq_V#HEWZSއlaގ MddgЁ(+Ng >h|>z[`"m99;0}L_WU7 s `'AI0&(JLj>aHgikA.%}OD^f? JX};J1Ħm kɺLkRYhEq{w38 VY#_t;ĒzZLe&#QHYW@1Q띱Vc&yeDhj4BZ";2npGV}fޙQk[]_Vrn3mRLU펦WAnzx;)ڴJڃ:eǶ6Z0T3q |lD3#=k nvy732+On_`fw{7\g]j )_r}:.km_KejRcbˬbݹFEM]fg\g:[u4)5>G,UJ'qI\tWtUGe<2L3"N)RbU<(a9;MviΟ쟡`!katg^UN")sF"5:'F|@m lϗ/S2?+kѕ#zΘV'L(|Fr!8üGq ǑsGuRkMSZؠ i`#THJlB O1jveZgay7 F-6Xšr` =w5Ĕa0:?d.shRJ2,MEZ>^gzɱ_1)Ǽ؇L,I ^m%$_}KZ%KJ*/-Y$υ!/Ϫ|4+uSKmZ(MͅٴCfPMګ "~0"r˚pgڤ%}N.q/0`THp ٵSH,4ymJhuQa#[)ES@4pW[!?46EǢWW|7XIc>Op$]篯O9tQ AF.셶Q?Ʋ"G90E QřqWFWS[OwgK1~k:>OⓃ/ݷWU{]e2pq`q'$Ü$.)%OkbqKLۓI,yTT*|*a>WYO,Zh@@VF:QTH%Rju0N,yޔX֗>)1t=2󭟒Y<1T(/%Ng4emzc[Vž Rѣ>1#Hl'9%E01/FU 3:eYZt٨uٷ);tL{&oD֝ }trJ+%|ť[R8EW{`v sj|_N?v+ ӫyxStEdlk.l)}MotxaUފ}[UOh7-]6Yg߽`^HnPQaeStwexQ4쭥'_ZERPi;Z}&ۑY);r_H<6 l/ n4܍f Ȁ=,EN\AW5QgD&*iЦ V?PWwdaY*7T\L "i4hP` T+ ,-fJ6llǦejiҺA[XFbAlyXs5H)ڵ̶/*_]7t#4%0U$p<'4O` (L 7 hׂQ:&%l T8E<5Lnq! 
ºiRI"S<Ƙ }V-ֺ?mzSJCzvJ}3E=n\?, ھVs$c-֡OQa_`r:81I&1#a]pr:5=G+"< >졗=SjyF^ '路Pi7˟ **bSƅK ˋ?6>_u b|LJEeR**RQWT8ZZw\O'>=i90J0!eeUf*;Bi"U2]-1~=]鐛eN9fQ{_B$45J>qgdf,J[\L28ll79952(rm1WBwZ}7X8Hv kPv XwIi|S'"Ok|pK &!CtR+eBL,K1M$d Xʔ-r8">O5 ݻ[פGf\?.gp;Y'?uw _ޑU[D5&u[DQETo[DQo ;`Ê-b/`qu2cea|;˖-h,N^w ]UӃW=KdQy,4Y.U˃1/@ݭ%OZ\9BC]0,Am2yG+2B^;hI ?Y&t<4>h'MWqP(mZ&G4}M.Zi?$d:CjOZ\$Suݔ>?w I<'"%I^eꅓ^WctFׅGRo6Ry_ݴE^;[S3Sg+hЍL8ie҃7酳[{i|cRuJ.%%37xlJ>eLc:3#s;+^iUk)ђoщ,th^vM] T[y3mH2KV~B*1S^F55 =޹ Z}M5|ەsЮ(ysүmm6nOKd }?t'fPFN*cbm`:fyt)S1&eVl,պU7"|{_6$^Movm7OFj6>`AcDa.r*  :#Zf6U9L6mʾ# Ri blI1D≛pE#YF&Z dXwdc0JI`6+Ds1$}0TRLdB@8A ¬ކشV-ZZ6h (VTZd26.joT5-Ӿl/.?s_NT/E%ἤ)X^pt"F5G=sB3*Pddd kO@`r6*r"k{ݔG: APʻt>uvc0#̥ T˚_Anvt:LR+oGbYxW;H\v!V~j$]iצJ}l!UnHk=ShYr.;8:_t! 3#E~ - ӕ_iռm,MKy;ݎ>+y)^Mǟz:uvcOo9lx -a@ޚO K`G`MC~zVio;A:vv޻-CY9$6_0hP+,TLۤuKbdK*[_LeBw?HVrA= xFuS:Ԋ-\Mn!)(@Qdn6jɂ$s\4%` w2YNVASХ4MipIZ(I:c$Ql; "imA֖Go.8| "}f{ʴ߳۩Nn#ȋ:Y\xAZ̹ q7/g|`o`5~۟jN.֜M6t;]G]yM 6ѩoBk,5&QK;OB-ktsȅmym +Ox=>"3E +=6mȦ0\W3p ʱ.9} }Uyu|73CR5~<aFNOCw?9yЭD;=}㰔Q\ D*"XBgx""-ѫo >(6 2 ,DIuZ ]P7-x?%df³E Z*mmSQWE$o'?^0[Ӭ3}&d!+݋62bkuܘ=sTSH\𥗤ޥbӇ*yi҇Ӈy7pMr..y5KZ9mK>(qk.BָBƟ|TDE8l`&GJ"|Xˆ^FcD2Y+냉KM-a$TN<)c"ҭbWl9YPsrhFv>(ڑIѻ 7KG7A_s99V TNn:H9Yr |' ߹ٜw߽m2KC߭si3FPv$sdv s%$t\ݜY:xfX$:xˆiK7^mdJw!ImmΊ#ْmC%=QrVR;Jpu䪛ln9-Ni[S^l}Z*Q_Q:Jtqc%bmf5(U jT۠| ó;K6ꤓydžU3{NRX'$N:Pގ,cnx]&ݎ鄷I ѯ@.Op7M_tR܋V-mۏUtow,]|qnazq ~3k&P`d)³6]VazLf|yyvw*>Xr$ׄHՅ[lewlHX]y7I_0=68e gCjS.g!l&H8229:?RkMSZؠ i.QÕ؄ A2 c*5(K)µ'z hy_w<*ͧ6%K e$ &+$iEٰ?V 0C/[D7ApB݅i6Tt1h/ʛZޥAIMFƅf~8"OB8bhݛ)boVW$C'# bD#L;)(P]$lyl(8 kH֣+_?\{mnoPX,d~t^,.I32^.FD{5[Yfڟ$j0j/OM*IZC`|ykR4 +(-pʊ‰qK/Hl}r}NLt lJ|_|NuӾ59VQvk}VDa - lQ8-o^KlFJo73i'܎~yloyKJx-SF ,[k|$;ϝ XtVaߍe Ēz ;%e$:tơYժ\nX\sr҇f҇I҇[]ÙJ4gv҇ݷmN}y}d^MzAB(ug) @7oUiìOabUW/b{˖ ÷;kHE>ג)E 0g^]Sz .WU~*^ׄԒ a@qzhdl- -BSW_b ]#EVgWW{!,>]q[ZV =աK_eТQV̹鐚Ln݄ *b!wVYsk0ʭ'7헒[i[35Dh%ƈ% EiA[FH) XFk4"`G(*2STd\I/?myv)+亙 ;8Ӱx q, DB57EL! 
E:ow϶^cH:N}\ԈnO^Q,ĸzq{"T;-a\(XtV* 6Im9H0!H!QU.%9As-RÊ"DIxę` \!TcD6RoCNIl "jI =..n|4_4o*M$}NK&QD0|"ڃ%ui Sz^8 1'p@l$h)t*`QTɔOBϫv?:{7X!-e⹦Q"2b1D&DuĹt )G(ABsyk4p {^ eGTX/l1qusEzCpJW#bz$QJālf ^h 'NE"EhƚsfETŔ8*-n!^!0 0 9O$17rS6'ׄh̙h@1FO$@ 闵^lvcIONl :.L^6uP+R3ET6;˜삛cMI4!'Z Tywcw ׆ў1T)b+t@HmO"wE% @UvoҖrݦrjm(}md|> )KXT-lR wW]6-}6 6<~e'ӛ.F֮>ּݴ9ul 풫v֙<- /{ZN59ܹS81vSvdCp^Y)gJ:4=6o4OKi7p9Ż8, \Y=is\="N7_]ƅ`qhV"fVORU7@F%t J2`o^WLJ6 ydžez3{NRX'$N:P~,;nx]&ݎ鄷I ѯ@.K Kr/Z9l/[+N׊fi%qW9~qA'3LpbIn@0Yۨ0ENT [cI"!tW?"zI#23,FŕCԀyGiĦ޵De)Pʍ"V;y#޵,B)H} H1Ƣ&ɦIC˖TIqHJޤ^`ei+DRBHψZcO9cZiS g7Y_ΉG:i냷^GIG_;Q `{6}N}FuBr%a1v9hF;+BBCIĺܡʽc3E~ER.%Ő "-!PAx:9G21 <4S?3"[41Smz-ͨiEidey*NP5Ӽœ7jQǩxR [g'ɱ~cfgx͇*o?*.M}~IRqΦTY "` "uxfR5c+C71.虑0ufO.V?:f8>c);Sf4-gS9WKӜGξNsVNIUpqxjzY~4=UX4>tI)@}/(Ct:|]N߬TjD[ 5Ye24"0O}EZ3.1*'Ӛ߇w>{Qʼ +:tJjS@ hSsUbUPY-Ea, Hz H"u|qRL(4Wer lCc]y8v79i_9iVhlշ q\{OhY] y9u4blr.~ Wqu?΃gh/<۔*g/s;?HY9t2KYuwV %F`18ན+p%%6EW$_?|c})8 PmHNieN& Iʔ\Ɉ6F{t$YȔ-ԯ"u].k@D07 FkGsrb,+]&5R[,",N^Ym5o?:B ><ᬻQto:SU݌r)|E׺5gtm΂GV=?GhhͺRo["m6anf _p*~Z!>v Q>$~69Pe) 0]̃سټ&eӾ%FȂCzR* AUyG4q ^Fk˹4B;X's]5T&2E.hΉGr ,m}Yt a Q-#xj/:NܿQ|;7_{OD;Kcm̪W+6߶fVk#'j_lt=tN)x\RN:}\oZ=L~w=$qU2DJ 8gP)Jߢ$^8682eԹbI A,$4( M&L֠|BD X 6h=s!9S$vT%<&60 6BbRިd%?ۼXg*P;6A;Vsڦ֠^;T PC:-Eȉ{؄zL shlQ^Eo礻iTG&;[e18 ]9DDX׌8~ȓN m8`Ue!*/gBܼo @9+k('F=ll :R bbpƹAG"f]#ĸ8+ H*BZ0F>!%dY}Bo{W-o 'MgV6oO_rjT{GxFŶ`'m ZVU^ rXEt'lIgslgs4g.D9%. 6"%#'QKn4sܲDq))IR:'AUI !=#:j1>1p4i O}ކB [cHb=`>'>J2; y!(07u;hF;+BBCIĺlaa*,@Q*R] Ih b Gs$a:ZU+bA@\=pM}׶lMc/oo.'2$(3j@ALKH&ߢiџlk[nF=O-L#+sEU9TuR~YOϼWϸ^54EC*(Dx8/>Ng>w5sv~G˶iV|%'t]|K(a))jT\ ,&ÍεL#6q¿= oM}Fw8)]ɀ z^w4ne$ɬ¢CL (rA F%+|t{|ʣ&G,Zׄ X2[9Оò6bp+R0劔(3%d.:k )"hGZ{E<gXE *j%r0RH!: ] e%XQ5ck\-r!99L`pPYMwmlqbÌ/٢`qWB,$cl:#0>N(?to t3 ? W}Bx7fe uE\g8=-d)_M)r:(ҾJjhaβoN#dVS5~,LBc~쌮77i2֙4 = QdJJ*TR.]=RY;lw}GwYgT.Br"UWY袼`=]0YBӏ%bg[\"p Ɲznr1}3ia|40\ChJUm.;Z 0}%`?vdh"l֏W.-O^c1qFbو" By.P:ي7(K< $J5% ޞ"h%?pz5JqY\GWAq~PZuZJ!t+{n+X}6⪀+V }@e+ޢFGUztǟΟO8ץ/u}܅gF;UJFXc+P(5J[ .2Էrz7=Q!E !'V^47R&n(Y R`8D`paReCR@bL2﹦WK[3|\ҁl4ڐx@t.*SDौpLd. 
&X}lo1oSICc2%ڋELzv(tƃ,t!/_u $2 B'3iقǩ7(=F,(\YXH0QBs`)G㜁ZG!Z0Ik˜KKDlE@U> I < im{Y.hv%! )sNJ|8dvd3@L{;n_BYVy3R R@e8z@gфNt:bMSD!M*dN;&6hɘB3wGa,E' 7[v&8Vב)>!f3a.+:o~?KJ?Rq\F_I,8(ϒQd:2c31s"6-6¼$Sgm*`k,R/U;i^mp6=po `:V沩+չ\6hOn@iڻx$ McvMz/fܮ_ Jٮ$[Oeۘ_gK'sы<H8L<8,$1x2K2E3k .]a]1c΋7/o/p!Il^ޠw4H' g^t:&$R]n&n'{|믩Y`fۯ2l2|mt/PPao[lÅem #{4K5NqBUU5o f8~W٣/KPn,g8:BgT3c9f-OfK(d2FIJgYrBz\x#@ 39|c'묋ȇq8ތg67wҼZ)wJRnJ,8\X$j,MY[0o@̈ӁAW[en86+Ͱub4^\AN1n;L'1iiThB&rIm]H0?<2~WĆh3ÚmyX~6))y@EƜNR"3bxHdyssЏ/:}QT%mC3ds(Aa9*UfK_rn# l7O}&("+㤼 ֣1?6Be3OgcNceYAhX g>ch(?ǖ_hڅRp2F5yH : td\KYUg9{2R7g҆ <:,hs;ICY$OMRrShl58}᭷ wTw*cèQᏝr5^}TH?uOO݌cC=5K䧆太;ϵ؁~\]] {f_MD/nt&7'\>/o<^\.͗6fP" ؏ORX gҌB*.3n/M[ŠdAjB]3V$njC$S2 DeCnǟé[Zfe)Q)D.#۪K&.nŕ5/`4ϡww1ә{> CPĨ9!K@F&&WkGTvo}0%V{_'~yhjbZ.#jei>./7̸7*è)5MåYgt|]@_Mlp*Z\-{ZA3}JCrKgzJׂ*BUtsTţ}jW'޸~x Vkb:荐[tKtR+L~Ҋji.RH->I%'[V椃=ĔL1hwd֟ڈ?B'2&C=bS'^o~6t`i%tUr:pbBRސ}4nA4Na )W8BVoj}aܘ3'߱lGӞ1G>m¢ xd:pAjD20u?+nF!X{\"KD!sӡ]O$ ]=Я>ِ6=8/-Dqh"IGbIŔN.9Τ{oK:fΥ{oVS[m޷ؽWjfD{?CDh=vUkKOFӕNVJM6sƹZnZ>Qk9K `tQH0-A 0"(RdP+NBݨ|ZGg ֯V.e7^a]S1U%rbZ|㿛Fؘ͖+6ɀx.dc\)|Da@0@Kɴ٘d=9ޠxP*pu>!q[+F1U +Wǥ\^3v:6u<[mSLZ+ZGؑaVJܭĠ^]kbyd/ IQʚ *ɠY5([ٻ6$W?`]Rއvm4 L h)M4-/odU")J^RݲJfEeED~Ǧ|n58O)3ڱ3 ZDB]$_jv Lcab,2]򚜮yz׽}_tpz<}6!D& 8,k W8)Vslކ8SH(T,@(nG7`.0 r# T)Jl*g#gt&YT[(f|5s +'VHcgR!?=V?+VjX4W'mM}Sn}\TUo+'ҵ.VDA0DЈQB^R΄2kP pJ1*fJo6# S\-&x#32xddlf"(ܚ195cFwZqӒV/V^ɈcÕ_(L7WD*P JB:`t$LEt!"'U;fAqFK^!a iU30!` xPNR<2G0!a\~2dL+ hbKFb$S,W$53Gc9dlVZI;]@(6ʪU*jg?︒x#UІ(DJQpKf(tĿ=X0*X2Ēx sZDla% ,m,ɒXm˂BV#LZў@g1*5WÈ$VZJ46_=K:]gUug6? fa+X` }N}ƒC lUPq̀4<0 m*CA ƙҢ@A(H y'kGS @K$0):7%hLyC5*Cz Z{1[,lЬ\lFA3BA"G5%u|L[fUj Qו$x"HqslщJ#&A/LQ>jc\'2»*1taE$]:S~[,yc7 J(! 
ןc%@yg/'DOy[&asSymꅅm:pzh?|h0zx=7X`yT%??a V1b0FsJٚDoc($3<3U"W]jF箮UWoQ]iERQW#tͤZu/WWJ[ujԕՋ+$5j'>5ډ\inru;+Ѫ}_=VTXy%L^J q*Q)uޤ?֦c٣IvuB,3Sw/rđ^?@i w "򙥝c%tv |M#Kxg36t^"H#=$D<fE _Pa#P]i#v>,g${FI-E⥽giAx:+lۯevVK GUT Qh/0-5ENiaY2!a]ƥ&LIX8FҎY | \KLy.S6rj94կ/t(VX:eU:ݫo"UVuE oS7^Zf4Nܦ^[LNklvrCum`Wm?l6X@QK7FmNgGZI}߾~ p sM`PQɂE |ʝF+a\ X9nr6u{AR%9 !z$!!$Ln%pGK1OFA)zd&#QHYuwZ D佖豉hj iFΖQ4n>#opq`naV+l޿"p*ޠ[L8Q|*}c8\5g6af~p\ig..ismk"גVPOCӬUyd.&tDC ]=]ԣO/7L:!1kׯf5!,`dpӮ[g3? _^W/$5%&o9\bpVY`TXO3H[oThY[\ B1@-C\#f,(!2hLE -B@Ŝ~ ƮkkʪywmH_n&Y|Xs{ ,&xH$;_zzX,ɦV7fUXxÛH;HԸG'fsZULp]2=xw3(? DZ4E8d*QB)( ?+Y *Xrax8FgN5t|nEZMCc'<&Gi2.5 AZ#E֯sP0{Aԥef99]ףչ9̆ٿ3s!d}"%$2*';BDK\LkE2K ($Eȉ2DE4>( 2O}F15҉"6 g7ƛQiUh`9*F'grxr_9JݎC(k(2ȗTI`%lTVRBq¢!!^{-THƱ`c)qb8 K6*- gd܎RN" QơP£'fԙgcӛ{VNnsɟ6hl?޿Oހ.Ƨ!qa6l`cDNrdXTBKh7|L+]&[s\V[D%8, wZPd!B$>kE%dB`]H"jQ[/+U{CipTȬӮ^kQl=e4&XM!IN;iТX tVta#7 ߿7٩+:::Hv4tnI~Po/xx?_r9%@ SsMb5HhZ3Oq+*Hz H"< 6qYzv唖 +!i9ŋ\B&\)a%$z*(a?{t.,C&.(Clzo$r~(hO`}!" 3Yנm2ĎY?kNPKBRKjtNmURRij:B1 Nua4Rolp'υinySSyk"ម gbgNE]1T rUWeJ=PRj"Sqn L!"ODO*Pq qF{kwkNZJY?_ i[Fy{h N ]PE<`I/KR%V}EwqG)zJ i7?ֿJA‰Z8z7Ēb#F8i~3^/ _0(I)oօ29Y`_z/wwLPDɭ(/{(huK9_Yp+Hؐ\ަpa5_(TA.ʋ)GkȹB%%w lᾴB~/ut!z.툴!7DմX| w?h86 8}\whCDLmP(]>Qǂq5׸W\k\q+&/6QW\1U+q5׸W\"Ylo_Ql6rhtҪብꐁ:YZ]& *Q1f8ij`Q)HDEg$uU(hNmزwmzhn_xZn\ys ;C{XaϮwߏl.u6,vgMwߖnn5}c0PI [OdkAd|Ç7_ǖ5?,`p8&*#ˑpթZRgg vf2;Fa" 77Y 5r7wᏟ-=kE3| 7t,w]',6gLqxL@ &g2Qf$E!RD[)V?{m.nj|.n2k}FHs=y*vlz)j~+Vs $*T9K!p 4cYhjR0),C.qT9*x{rj=+7YWYgfOO^10yG&w!J!n<g*V+__rY2sͲOX)^)hyU1m }W UA=Kͷng-!2Oww7*Iʭ;,.fBQc-,48#)#~>:Ž#Xfuذ#F=3GR6 0A{Ÿ39 ۄ#溨Lqt.M;eҒ΃(N*:}u37ӡzYVʛAQp<-8)}|yPQFR\/ D#pV6(0FG&xN<|Bk>N^xh{XnǾ޵7/{2,l:{Yuz\f) ǡ54r‹7Wå z~(%~a_\ztA3y]~l>s3fpDG:֩5lGֆsy Gԇ<( u>HU X/tҜX⽡I2H cE=D:B)+8NϝVE* &qAᰂ!ΞGt~6G +{ U+4<-]eG{wsd n&j#cP@ XОH+ jl`|1@b6NsIK߿#^C3Ձm8I gJFLqҔ[AeoT!GD%>s%($ܛ8S=djNͬK`t2, L M# cKɭv]%.UT>Vt6''4$D'JI "Q I"%c*#eCveG\&"Z21Y\hYb."2  V|zW 羌jcav(!U}go`7'/qf|{3Ⱦ;ou ϻp겣z)]>?[O͚%8}?(Bum0}/=;/g똚|9MEͻ~~|sxj (wv(̧^'ɯzf#Naޣ퇩{) -P7q~o1QDm8FY﬏q`'-]Jw?"ycjDF;ru$+;ǯj=vҦN2P^ [aZڝCCyG=3 
yTlrǴ>BaYo&/)|Ïs|CVi + +XϭBm 61X, 3ݴ93)zjGֆ(~hB 1\Qm jF[EgyU&BQ%#TdPꙧoltZ{x+F:(҉ W.weQ< $PPh"u4m$%L_шPˢ}-U*iԹ\H A Ul$T`J⒐53kEQ&<' ^;{2 A 経OY4QcOQ659#xL8Muv Zh z@f!`# x؄|L4sLe"ۡsަu@m8kaigeR |[7%Go_0Gq'͏`X8ip+HؐrcSsU7Uzߺ죵g-hM'mZEA!K%OY1E*ʶ!ERC!y X{zU%Ff.δC=1ҁ01 I#uֳR49 ST^/.j3zθZK9/l 3 e˼PtpYǷP^r^7V믊we_w(\?ҿ*ǯcLYhq+Jڨ3Xb̄MªR&A6lgˋ6:>t6(,K]FRT@B1DJEh /0c@6(^YC QZjf)ZF$0[Q2!"΄NS [r QZGi^4(^IU,jsGpl} * 9P8JO2']݌ .iT@~l!&jD`B8 <+H^ba^uJfo)B$QĘ9i3EwVݘf[ԭ#p))\Kd)ƼYPEVBHZcL{34Y^ZBt59Niě2hG_}5l~&҃s 0CۏǼ!/_?tp`cP3:>2>ܩ:]1x*l9 5P|TCd b '1sm0,*NK؊GnS4*+`/_N&0[p=T>:@*t\"Itqd[x]_ZP?-eYX6TuTיJfwg~v瑪Z K1D!&5uK!ќX 4TAO/R9ݘ`G7A0-d)sj|L.-LeʬbxÔV&q.۲8^y+Q>Ar݌>7@S Lr(REK*}NmHL9FHts" DhJR,SKQp&Yq6D}ncrrTkdPQ@\8)Q=h8e m/:qU6Eܮ,Xk6Ѕ F8d"SAdqWB #A33,_Dw[G ZyE@bIq94Q81l RK \4( ڽ#lmg`B8TVD\@ǪLŞ.f -xt9|x±ӎ^+4K"ix' Gw }F`F '  \'? K5 n;59oaQ\CIZJj.ʺ*+=0Ê*y7TORNkFUz,x"'WY6]}{"z$]ݧXQP~T}Q]TՃ7-L6jڛKѠ9|7MۺT|].T]^'9;6)X(; x3Ѓͮ=,zR]= J]'+ݱn=+Xî\#_ ;B(շȮ-"(n=Ay8 r.?yL* Uۥ'@W'-%JN$5QNdqj6'Z~pc-D̕3f ] \XHzf1ŒxrJ≓Lp¨3]\BΑL'̪U|FOz3؋0*E<|#TP#qJ*xN"&9d\o'1[*GblVEOJ?N-Ctj^-pbB:*TH,DxtoF-i H-VKQceuT=q) b|eZHWרku_/Qw:O:Da2`]\e8e67Ych,3@3Wނ>@.xe_&况}JS &=j^? 
ױI}_%%:w2sQ^]u>BufaX Înt]~S` ֟UX*] bS 7s v­U8} SSriWlkp38I}Ij/ȸ|:ڽ4{4zB uƽ(84J{ɍѻ ;!*o\H=k)Ά-p.LA2'⎠;#`qJ V# ;!ZqXp2]i9r}h&" .IxF8w0׫ٷHiPUS~K櫙i!eM@U{>-}vsvn?f~^|Ċ1^J,sڼ>82YѾR5&O63C=XR*GHYFby<.Ar.ࡱ):B{Y[џ"?QCf=?ӟ*s E9 `ٺF.8A9J`ŠePG@2Οڊ?Lۛ-wSupbN'j@;3\T!Kx WxWs^Ns,LC %w P3(ѹM9{iyヘ]YW0DBrPx\(aX1YR;[Ke В[,C( ES,!1f%vPLBBL"X RF\HΔteTe' mWj jR5HtyqM X#'i>vw^ʒ}C2ł1AVڔatu3[|ەg S'"jWѮUZ futQ"]@7(;Y$xlws zXY1nI\$-ݺ"f5OlGJ(#eQ:;0auDgX@ Tz!vPI"mʯpr9-I6xt|t\܂ tϋY챻:j,6WEV1L8 S~1=^,3߻ذt7 =.3;^\-Yb6}sMTJYI6 diWUDI}=W|_<hQew;mq҂oN\JCRXEIN'bJϓcq+aU.iaxCt{M2Xǔ5xVPnq3]yaK^w gB11 o8p1?Ao0E܁mYMN'uUrq_n3,V%gXv0vט=7?l&;v;l&uC]uWz WtV 9p.OHfxP0T~J}r'(:o-"OI8'T~T}Qz]UzFMZ{uulxg|;5=4r[ӫ`wO{h *f.<5o'b%Y~IgZ9r"̧]촧X" `Ee+= L) Er ݇ UcK l_䚌M_ݘAQE Oq䙄BcSGfByV%883ukc69Т>qI؜# *-Kpklvw 67 ;[M})z,YƤ8pSu4e\E-?}lKVF{QL%*יy@:s(a|ɅYDҞOdzan`K'da_tԅ~k<cyMz60fϺF}y2|2* -udJzksR N3D pL#2hrYp8\Pз%-}QVCP`ګa6FvM:lIP`eIsɆ@np [s|IZ'iRZ [d踵ϙޔ%+)Y*Af/TzqRr[HZ*I+VR2Uy+V%\tIKg7co<L F{͢ "ip@2F$CCI/53eDXt힌sH6Ic.1ͱcIa3B %5gZ~.ZN1PUC0+"Xq8uU]9quUpPqݩ4H-+ |QW\碮Z/N]]*KTW@݉?(X{cFe#az5=$?im*ȷJ_(͵Dݧ`-KTccMϗB-KT\#uEkF]r;uUuP)e^r2{N`.٨B.sQWZyP:u+㣷Lqn^ #㪫=:u`OK]=ԕc=mS3RW`8uU=q~*sPu^ܸ*֦c(LBoϿ}hhsP]g_xCT D9K/"&P2oXB3J#9h~`OZ2Z@HUï7Eg++1ow[ׯ~׼>=ׯ~ '/xI1c}07F3/fnnuwϻN8ij|M[,\R5g?ئZYݘ,phGGA=(vSQݳ(v\(؏`-&,]5\҅ZO=,]t Kİ4NJ4\0>@KB`1lTY6D|uR6wX'X't*Tà\_^bp-f.52F{c&w_rҺE>kn:kH ~Mt0d6ns}>ÖZ>UX޿W=/z4sOQ`tY'7ܜCҤ"d;dwD V(HJ<7+ʑeF16l, }tyʟd5/tgEFl?ZOQ 7_ᦆS51t䲶;\!`vy<\; ]#Wo,XPU{4l Xa>ܐkb 95N^W6-y &auWt iz|ћ/iSS)%<3*G gR%֧.]{ʉ菤D~B2w0֕3?X_rG<g.ؐR@ 5&b-JBp{{7_ }Yac7#;r $s:FK֩w'IΜDinx5 K2yzN$ N0ɵd{>Xޑ0: c mDK_ppnH" $ Hz|+5C֠ua֣ ;>C=J=OIaZЌY\Vw%K/ WGkiZh}\O1/ͦKӗ?,yӯ'S:u>}1 &n4*0:tP>;ޚv,u[\*WI-րa^Q6-gAȃ.!L#4J{Mcx_4o'Wף&,^bf`)qܒoUb}LpƁvqi4 ?z1О" \ռ7m5K(IPspc ش\mBůtg,B5:2]W^c$H-Mar=^fۍ,3 c0߿5cMӕaQ?7{\˩\ʡ6b.pˀnN7cZQxK{zuE{}1 X.]01uKRX7&nS{yXN=yJid%#Q.N kޠA@ERD4^G'mohxJ)rJ yc`\ / 9*/UЂ[d3bOm%zR ģRHvL ؞U=qaԁ.ԁ#:Yy4ѷJ"}e$ Ҝ C%bڷ"YbW 4󴬇]*1_xJQK `9FwDSiWYjbI2?[$8>9>E{ܳr^B_J7saoRM :]ԑym "lrdL$63+<@VNIvMZ8*;^2x֟YinoHA̿}xL3.Ӥh#OeV$:]9@;1)M<ŹayD.wJF19Jtu!&}8 
>)ٓFAlut z.]C`VEQ86AuOubU&4liG(ǯEwӷ.2=}~xy&b[h_un;7#1ylsVT8U6篅`m16gz9r_iM{XWpٜy8`_2X77ɑ4?X-ٖ/dmRZ.,yjݝlu eУTU`zӞ\_SUiSVОR֮tTug$?сS^14q6@K lc1""=I\eC\;keų.{tݼ[jHygS7>hlvgr8OL9[S ĜSR)Zg EHUo]Y~{u.OSB@A&atPA#䚡ƉPT `Nf:(ʂ/ˬC;eo˦xbd/e(cցƁ2{-m0)8'|x3qh>`3)4JvX6;ӓ} P#)@5Y̾+6l XҢ{0pVcҲXKް${x_u:Tv#L`UR2zC%+A&{wֻRx 27. ('_&e1^WMAkdL0K9vb ^Cc]ڐԳ<&l߷6:|_x<<>/q#f(.'&,DpI5 Ad'ƧKNZbblWEc%*PPaSYV3*`r=᱀Ƭіw؝t֜cAδc_ԶP{`n+xpډsCufz;6~pJ/YZ.M -H23CًLkR%-\7 $";Lcf0Ց ( TgH?~Z{>J Gg im(E2u@^jt5_Q*eTmX`',GDnf6o4x֟ T=j\~| 0S '~&<Ŭ*Z]n(NO1M:Y7 ilm+RH85ρ~HgkG#(  hbN9IX3r̲dH j/o A%kQӦ\!e**9TljJq,Pl.f/_3ۋwof}O>>6s֪;$&˓*,Y"{JBbFPrZ%,RxYHuj)b$)+M> K2F_2ILبJ]Iv&v g}v{VT:PQ*R1ţ|IV[}b7AY,n:Td]˂5Y( & S,SwlBpjKtWzߑ[Ϣl3sT&&2Z/,s* eVNmitQuSHkN4nոoqmnmn|yq588}R6Y77 [=u ?PQ=k0=T677^{/*%+d,:t=+;"XZTJi`WW)+X*j:qpUD1k+Tӏqw%ai˕iRQeqNA-=VGI#bG~KOwnIyuc,o;WF-jC(X1~4_S.hX7jOT_ި`"Th4mRi@ RIvOY7l'#"(V?09a82f604яk!6|r#ؑEGy OZNY:sA u:sCPgnO5CS u:sCPgn37ԙ&J( To誸%E=J bH{ tJZйgzCvnldEs \(2,x |ͩ|O9Kd֮?Ks?1gx2 !2F3χ:9 1Y. #(_$bDY,b?mi&=z9I9Kv L)A'X"3:gǞWz~4yDW.cC."gi9B@ ^Y!Ab &E+"C%Azݠ/55bH)xGJfct JH`UB<(k21.# Ł7 ԭ!mZ%z r!:2zImP7: |@aRsR=fd}aIu#GCZ&w_5w>5hǥMjl:]U?o]Srח}XL!;nդ˗][ .j_Giд&i{.n7RB!ګ9['J*Ol*Ŵ]hK|7j$(i}늠TWW i؃ 75 u6>KeF_j$PTh~A1w:eB麟UX˦ͷY Տy ٷKJɅ89sVӉ'4ad9Ic:v[IARd|֛V[5{1Xr 8/#_^~tݯG.`I/} lFbuD8b ۱xx;1{;&UGVwޯlo `REhnw᷃s ƀuPnv{zuZ 7Q"]}u;4ʰew-nrE?Y*O:r7,/e& e_ktҐOzKqSϱuEƺ앣X{[{}]G*\.IeZD!ոa%-kO1\ޥgZzɑ_1t=W*qYf.syƢ0w⪉3ЋfG'^7]t=~`ӫVV[ߛ-oglʟocLõ}-{V]Ϯm v S,|MDs ->/3SZ_Z7ֻ.tzfO`ͶCjP˦û5zo.cƀ7ZlG뫻@yȜM08BcV~TJOnjtkm)D;[tj뚶7f|=O/#y'#.}!T֚~~Q[ lRЅFaDu8k0 yTL{ƜmFdsiR &, $n#d\pOXuߚoMq:'*+c1RwژP? \o=KjybZ~ze:xYvD2 hq.r# B0 VLK"CLVZ )$QCUDz]C!|;}QWRٰOvY^vYXgix;8.6р@>sBd! ZJ/q) jp\gtdz΍!I.yϘ8fBpkH ʁAe &~ ~g}=.sydkw% ڦXe+F,K+5S.qe}Xe/^4Ѩ?+Iv5(ɠtGJi4(Gr_ثi\3~LZ buT=|XRakW(գ맙BjJkP^\*NPr9JB[aL+\p@hvk0ˈ),iEL`I1!%(k^CRHzc9 2*ٯbQ+CDy u$! 
)+$'dXo dAJ > #3!ǑDep1$V@rȒ.2gsŃiEL:HГ Nۘ=)K`o4$pdXMȸ V/B] ՈrcyS/{bV\X_5h`2}\-)͠a &9>KA+PK+¦NfEP?(6(4PGcg|Rr)NS8gӸa\ jW/EmS͈#vfijG@b:%l%QJFOfXD9 6kc4OsE21CȊ.e kbb% G(0fY Uj֨_(x,Xm~meD#"XlMZI\FD.S$íL.2r"-*5Uc׾GM555J'm*j Q{w,N=cܕ~ёɜWJ,P) UX62bv3ڔ*f9JjLQ$Z59:y@~eI -CF:j/ͩOhsXAN(qq<#SS)aG\1^#1 @&Q} [\ڧLgw#E1Mfcq4-]Y|>;6 rs݇}~ngu:6v(Cy<)1Ȼ+]XCBoC V/jOet=d))P*MR5ꊂ ?V]xzfX@-vHNO_>ѽ?WoLCqEgc6:uF>v3OWru*~XiTuRj{o%E+a$+w%*tz.[ܱ_%ԧtĄ>h) }P%ZE*vM XCc5w NW^Z8Zl/wKwv]`WV-}gy3.Vnp_~ CyEvk2?l6QEj.*mi1 I\F"e+NMn4? ;NH`B2jR:D JI*R.0s,-wV84 {g0SwMM^W??Z#fl]?ôMt ?ZddŁƿ'8'IegEnBtp3F7Ǝ4ߘ AڔLVX`AD Ǚ-E&X̩(<:snE78ϊƪe=ߛm=Q>!G[w]^ȝJk(DheįnĠ!oiݬZW,*)wl75zo\獖CIBݼ0ŢmgW9ln8{ /`hu d8FLHRr =x-pF\mU}V767ӫ210 [ 7$n{']O嬋@][{h'ΨM-VӤ.`?z+xܩK7HKe7r4_U#n4n7j2&(mRyJ(!'\K1$:)d#OGvi>޹fͽ ?w Q|Vp$D6Z Dk Ii@R^!_4>#KK#)Z=VsP*\{xw2Y=kE{|f* Q}'kL0Xb;.I^(H"H-s$ǣ?#x|Ag>*Z $f+%݀y6ge &/a0rHۋRY8%8":yĊEUWg`86$U*9$%^UnY`>]"4aMjNXU"+Pl/Y-IDTZXeֻ;AܷcPĸi"һ1D[Xl1G 1̼"u ( {sILC) A2k uȐd\KXmk9{?RIA%xp,&\ .Dܠ'<@;*~`2BJ5jQKXѯu gL LFu>9`Uoo1Z.NӶ7'ퟚH O}9AuyLch=?1R//Av&7r\@ϸ=5s(t9),t$w&NR4X kw,Mdy%ZPi>~nnfrwDˠS韏FNla:""4S'K|#%Xv;J θd[#iu!uzN6a;@C @H\PMBc]\B Ix߽E(d޼nލ~km xf+/uaShr3Y7^XqSWO\uVHwSƛOZ6[*/DCbeL|SR 8#^}(] Ko*@UYs!RRX6BtWS/s|8gA1eEFHBPz ,Y[)t>EVM>yjU#>_wxƒ) [Jȳ3JT5 z)f!c:00S{TCoJ&±r̸J8P go8}L#>ߏzxx_ N6yGKcrVGF)d/0 c18g~4O^(|vsiR9}F;z$Z|ps7)EMd+A M^ IEL*BL&Il&9fN Y ?o/ۡ]Gd@wiP+R<ݾZ[ۏ]JygoQLmͿ'p<%> wIgy%zGDϬRbC̥P͋2~.V=k <0yT= O2e *dgb~_v/x(*1vni=2L.VWD}Y.pҥ#2Aa: Blm(4?{ʐ7?ё@Y(OGl iJ2gR%,=\p}Qv]Qw$u'2Se#gF%| 88jЁЁbt )] Vm kȗjaܕ%kN]hS{McF͏3?ώO&,^9MfUG۟~R]m_IMs-pD8.I-![˳Qg6\ ݿcصR\BlM٩v}*uGNpv5(*']idtxv2 ]6ĕFhd ԋoym7}hݤ{g}^mJY8qRD%+,Fo39r^yƢ0Jkw[vcߏK;GI*VZ LdZtFT2 rL+4~%Ŭe'7$eo1<$$7VH-Sܠh-s篗^Dq4+vfŮdm'R;Rj2]4+Vl ha뻿2pwrmeNMz+>>/{:d spRpR+k$$xe~5q6/\c'SE~,V0ѤdN*&rNż$}EW-̊P*Y# )ڔ( >Ls> &Yg956N'qɹwlMZ Z{@n+Q;*,ޥQJFO̖w=EY $y+ rE^tG&& PrpLe=YGȩ&`=Y~f[hcSh{ֈvЈF㕌\ - IɍN I& oޡ**-A׫F3f"8i%rJ&ڇɅ@ Ej C{#{5S-ܠK]b-ŚtmFS=V|ޮ&SCf3eJ2{C(t^`LfOՠ7pNV V(*'P!On/*C ឬ2G,D8gOBh仓. 7ÄX*xBJ Az )k@ȊQJ8r Y 5(L7FztS . 
+tĻ;16 _3>FlpFp>}.;uBJq)6ü"X`YtR3A*0u-1lV&l>wqc0gNűl'iϘs\1-Ylsix $ ]]л9ņ *W)pxk%6knc1$FTt5l7GH<'ted*Tf8VX9~sGg!s!B`ZAbr% s1 BӟRi(1JU:%wdJ̅7H@"39|7rs d_N;"nOl幯or ]ߧҩT)Oe&0sSb'\cNȔ>$a@0AKɴ%S::d<ys=pHRK^p3& B`$P퍜|'zrEگѿ27޾ͦvh$0ktEYU*WI7'țY*!,ߦџ~`7]CR%ԸajE?_?Ш2GN: R@߸w@ֺ&bNM29o;li4j%/r> VVnB/G`_Ǖ֞>rǖimuNu_OCՒ«7#^fC(rkTG=zwC]iT*}u?0I;U*W*0_FOKmrzj%c/uJ ew)H"hkc0Ry裊d:gUeFK988k%ǽĵ:&]f;JeB<,dյe[Fy{&gomO` ٸ);R_~?6}#ٔUj@%Z6WjpCDڹ)o!?el:va߶JCuLހ|f ɗN:"DbBG,D2KrɛqCImi.ՓذB>'KM7Fw?=xY䨬7ΡRM~3 p 6<( Z| X65cX'C4 b[;6àA1(J|Zs#eAR8*Sܣ,dWoTmaK^ 1*Kp$H'wsBpJa4;$2V))2("Ž|Y8h5]{ɴ-&$>nEqmV||&բmOÑ D$ \,%@Fod%Q}Ɍ)R&)zeMX9Uu 2MfЛwLD+(bG5cfw31ʾ;&) z XS{f0z&n&K߸a;(+^4CL4,)Z%fSDN鵢 ڝ4~[T^x^{*ImmH(:mIX] [ąGyn^P*y^{|sY@5"=itqת }RKVyfL%IU_%Ϥ&KlEգe;7`9iKvDd<}I sdkIVPhd܇1àl! 8Ҳ,X}r4 }e )qB{ ңEKP%P!!cQ [#g49b__XP Mɣe_ϼ;x?~E+K/_z~X-F:|Cn 1:$&\NE/>I2T% Lsq _tFx>BfLW #4at(A+L]r2ܐD$$AHT#z({< U:1Ajjc ;nWRwGɣG#)L 1˘g9IW1hH?w;?FhDAjObdhG60 ɫl2AsZ%:cL`%rvvi'W`|Dd*`9Yis6BYD\{Gx"-T(T3sgnDp6dʼ=}\Yӏgոw9(ήE4_ ko:5În'æ9{K×`?<WT+xx!e'WzRT887&Q?>;OMڥԆawWwSUY')SkGσHM0-YZPir6\WKO9i1|(4Ɛ~xso&[\b=f`)q.Ϲ%۪h 8+߿=OL:O}WM%$981HxLLPEBclz\mB[k/ۻ"iP\=߆g6te/ \c'EExf%LX<5i;??5ch-!7sp.E.*\ʡoOӞkRqo>Й^~iP.}acN_x&/),w{/^9>ܓeFQ2eEpZyN[IZ"򈠤:[1=Ȅ&jf '9Ct]w;aGOGUoY̔Y\9- {j1ccR.ȼd lrdL$63+<@Vwjl ac:_$scjevm{(^;'m9;F2ZθLKtZYt0ҦaAHi:jx/8uׂqKخ|E:_/:R2:Һ͏󧏹;L*dBK3klFWawpv=G-hiߍ.frd 3tůn[Irl qYB`E5V1)aJZea{ǥL\d3CGH0O/t˞0T k[o`26uqT\a׷K6YƳ][9T2ޭ,qz̮zz o>zR{K"Wԉ.UXbȠC.ou!Y}tWVbdx Ôj6X㏺~ӯ?Wl2T呛Fޥ.CNۋ{hՂOd>U-<=o_v@sxx n7G?S6ۍ%8$/\I-b8܍vs^r_ <Fe.rkUh=OG &3TDDj,E 7z6V:B-~xw5jU-^m67tna|4WZg!B`gu.zד" ɠ(d޷LQ(ӕ*WX?Kӓ~~{~rKc8I2tQ$9[ey0—='2Sk[P ȲRޔYreR X;Kh) /6 r }=BǔIeESZ9iOo>/YbvֈC>>p>q}+*6@>f:U> pA.x ;=f\k3'bY6%%S1kd޳/LrtөC@= -MBV뚣[/]v'{=!lb bM3گ2t=1ع;'ۃϧv?8BVUM$]yu&s9a B(m` Ab2`8: "xN"FAUrn &!0Re2pc 8Bg.HK"fZacG/z4%VfO0R,2<3>^9􄅢Kd4S6NyMͱ3ەYK B #)7?}={%mё) W7J)|SvNiD-lfdƉk z[Rҋo1<$InJfC˾Y/rT}FCkdv[S0=IDb@2`E!s%EO,'@-LhՌښ\Izf!`c&$ՓB51;/"hshRDr$fRRmkYW8c[]ZօӅ;ՅjNnu˒V"͗3ߎo~gt &[!K#7V 'EO9dE"nಅ,iQ0^(j Zz-I+h2{̺,KRfض"gA\0.Ekjmݲ֝D60CS`Lg{!HFKd+YsP"جA<%YȄ 
~zј-vq[h{{f8.g z g:RB0nq1i1A;φ|=8eHY`Cʎ;;ўɃbS@/<$LgqN_z|^o&z2OGfLtdTjٛҧ&'StHvRР|ސR͠@n|{ v#N=X?O㫷/HǿN{e3:,{Z3Xo@\exETۡV/ #.^oS_~{p#yW7oyz;fu.Ɲ1$I֕s26'<hkmHeH~`^  `bL2IIvW5÷8$%$dNgꪯX&yj8 x[\⢊F@S AdqǪX2zReRt wT]t?S]t6q ,^9^s2_ "ns#mzt0nf}j+%4WOc+m`Iq9Q͒+xB=@)&sYʅY^V?]`s2-[K<& F|26h\/kF2[2Vێwz9`M,FkMUuXWKʂMA<6=а}V.GUi+X-+ rcxiDxꔩs6nf+LuU?f3vV!x}y!ߌiYnݸJ%-Б[GG |M<0[K)'Ja- 2ЦqMnGЧ:oYXps~_Ʒ2"u"IKҌdrΚ :38'y2ɘh4 =jBLr6 LP ("km8kPS 69OX]; 9݈t'#x.sE#H\SPgXcET{6}"K0)som|Mű ex @"Pxq}^ߊrP^?&n0}b8 JϿf?8,.  2b_X7vŋ^H^(^>KS(feY.^~]eY_;P{aGE#iۣO^X N77EoNj_. NPWX8D!J$f ]1ԃ52Tٵq!n xp2F~}>zYO]'C0TQpq0F_T_W+\yMͯAsaF;gyCy5k*6_Fv̚-+_pX@NB'У rNfPTL8lԷ'CfTr^2 Yugn^ Y/Xoqk':ayab`h.^_l%%c D{vKHrw9)N0?@bx?/z]{zY.'ժ^,^߸$ƭ4= B`'?=B `  :Qč l+z<0M6F%rT%ٰ 6_#0VlvY8,Cewˊ-[[4m͞2VZcZKOoCokFG-wtN8օ"쩳Q-)l^Rb6*[nj9*(K BJe)f*ɘ@ApA|fW|¯'sDF)ZHhЊɥ>m!Ǩ{vQf2k>Oc~~2q:rPmpRV4lM>f퇨-xn1svƕ@l "? ^Tcq}jzװE%ͤX6JwMsf~048RqlX6e~-/kכ妇.Z(eVSeS j #mi]G A'j* R%7ޘj/zS}qa=_t#=`ҚB3(TB8K6@sRd5{ P6]o^QUc0U? M_2۫IejCRXGaTB蚳}TgmrR77$އ-?$!euDwkTG! X㧇32<*% i!HScr`RhS2,95@s[MW};tM ]><]KoW';ՙj rlmGkB9sr0-Slg2PF,7J5eNvnaZVzt*Zh.`2zF-IHXv#I\ j.E—"e+hb^OZVi-yB0oPDF`S BR\hX Տ0cMV0J`Z@3S >gF%6" …e A;KlN[O~\PFVXˬ$;J%W2P,kfdvԮ š%>:Da2`,2c21:H"%[n2k6qs~̱S񁂩 ` Ğ,>1Cy1_K޻)-6bTdë4 9xZԣ5%.H >0\Kgrg88K|d&?gIA I6^C,m<A\N2ud$nu܁w'_⢸cL>X3x1ێ/0x@cs)7Q<^Ԡ@@4W^`^8buF6YTWp-CO1vHr%UGb$:P5ON ƫUtNJ8Gis)gAw 7Qw@rz ~'[EHEB8,iޙ(=('%fd`66)vp49iT=4 hD$| JیS&yh. 
#N嘃2%:ӎABNPO9ԓLU\`'$m46:a?g;"j˴֤{H+a] 2/h8 5[-=Yl /Qnvqg׊ Lu;0S"M'my7EuR$֊xIɹmvUv *kų󦖯V?忪Y`YK~%kc2.Heѭ5̑iWOg!OlG?x3\ӻSe"ogyǂ{3gw[CHs>]nd[_]j.Cj5iJ#{SS Q]Wnb$FB&|_tz-RzuWfń̙PMFӔcb3 kn nH|wܗ{Ng6ngeJɖ +gFPg-tO=󄖏5݆YDvvY/."g Erp{]d/_pWL1k7419u|QަG`ȰCSd}޳"E:V+(~w>p$*>"xarѲM2΢R3F0("J/sOo^u(u uw(mQ^W9wvNo8J9~ҐIH{'ڨE8z^1K|<oKS+a7fZC-q$O'=q OK~.dGr!2ȉ5Ș)i=0guˋa1 eH¡WmLT&ЁO5ZksHcJ*4+F͐nS i;[J/ `{7y$dYuMvQ!Ujkﴯ % zK v(3yȫ4Z*K"t$)gq"rM+M9HUrjLECAƜ>mdP`?R  CGo<}'\ٺm%p|  D=n Y~02 6eTuF( s){ɭ3Y"2>DZcFi,SXUP\Vkg(j[ oۯu3tܸ=[RW(x!{3 8L1QZIjС4ZHd"Ce:;?Aw(r +K*:QLso21yЗV+A/KAUݩPxx4_|;|F):mܧFr\?]b\wܳͅ 0ûߘxJW^0 fWՃڞ$w:ԜGRs}IHI#d Ȫ'Y# zdP 8 Bۤh!! Q[^ËR8\&'Jy%W8}7jA,ۭfʆ;O8Gߟ;Ρ\Dc(%$*h}f)D*XjGsV&H蟕) Y `Q H4ې!Z"t4p ex(@1 6r‚qϗx0P4f.w5s;L! }1橃\J$W\;+e+E.xtl;n?f#R rPtwp[tb?x݆l؞6vm̩{ۘ\-FմIpӏvRMq*аKeINFadLpG~l2Ww6O"9׋8b\/Kҋ)H8QT8 e _vaGruTIBfɖٝyhU'ݕ$WEsK>hW6G8ML ^}_$g3qܒ{7!*eqWdsY*lI| JP\Xd5jù30Z5[Z!.P9)Iݚ6N,.0VBj6]P;ŋКw cBpƸY5k?}~:ɣfT#?[Lg xgc+Hw mf|uz Iբc|xJ)TCTY B\YZok%nffJ\j_ǂ1JW_N~q'|ylqUkvs\r kvY6uSҖA7vfأC|u12fb>@q.#q<~7w!<{8.!B^nIWskdmlHu)xƹJ.#z˨e9rLjq)y.[GS=1Π'ru Tԇ%椗Ξ):{"01Pխ{ Q GGm\?A>KQ#fw"X'h9lDxwV1s.NT54Eg$V,k2~A*t6U.s ?,AwkD9s6!xJen-VI^M7GLԸҿXzFZM4e~S1| jRl+tMlV]ОrkqҧgT"-nf|1#0RrךVoqBFޞZ?Zoy;:km6^-HoZnR~ɰ ܥ|&wRU;~@V\u¸ƪnKmT[ٲGs(\"r\L9'B8k~l3$n97GRްKBXM Ȝ}xΑ32ύ)R!XAsT:%CHKZ7Ţѧ#5tmPEz<)u6.u:G?.Vd'$Q'n3V n憖NPK0iF ֖a:NuSZ@l hۘ`BYXI ?'vTol* %OʺZ0eIjUmCkȮp~N6_X&!94;3$6"K+Һ`=3ݷn{NujDA*ZQӤmԽs:EF :Fm>4i(\.nmYiR o!diu͛h4 cĘDji{t(,3+$"FzHC/yi!SWB8lQ꛹VgnYV1+!h+r{w =IVh4IUdPIɜ; #0rB.\:T{}2ឝ-ݯDh#!Y O@ "Givy~J{E!ո^똑(s.lbqMoē35qs6IH/ZƩJj[+s$+z[tz9)!K !ju}K3h]R"wLMZ%1¯PZ)|!4mG;2;xNuR4է (хk§tp}E ܔ43Lo¥:uA mY)dW(! 
Q S׆RӌT"yˌOh|z.`H92X3jV{U`Qn< p-Ρ]ǡntg*C dr,j]v ܋Fg-K7 5!1ѕ<,⚲E3Xa=؜uLsXcE (_;wMA[] LI2, P [5mB7)#Sbf 2b!]Lh(pu qd+Q PꛐIv4U2X 48J(&J@ uL+jKր,AmU{ a C@zKovrdDBPDfE[(2·PЭ9+ c804Ǝ\)[L3HJ vfM0(惵8[:*]:H BРM 2͙`(q seԸ`j mY#<;"KQ @;dB Aꕶku2J"B}e] %ޗThaDϨ`#/z ($dAYh8PZ 6j"r $C aUhޣƻ }Lk2&།Ҡ\:mVKQEn"#ŘQE!j$%0!bEfC<,]cniF ޾KZ01սu mz@-$t>%u`y>!+J$W{HV*X(THPjz/34XH茼p@4E"ھߠQ Y֐0T|ƣ4$/d:B [@q;*q eS U_}^jqgE3 jbZ1Hܩ`U1>D=noD>"COXVXkxhDP(c ٣P.O9{ڃ^ 9І(T—蠻%_>&#TϺ]J@7S!v 顔Z%FyxWH[v(g ro鎤gcyp (tg IȚR\(PHQMfWbed͠UqvqPDYUZU`Ihް-Ҍo "롃Uf1uA:( ,3AUbF=K'b'; N v@b'; N v@b'; N v@b'; N v@b'; N v@b'; N v}N -m/ 5^;(\b D~l'{s;>#'b'; N v@b'; N v@b'; N v@b'; N v@b'; N v@b'; N v}>N}%^h)zy}_\]PZ]mQWgO0 م{S?g4yXQ#Z曣tq˺W'_ӯ׿$8OjFOh? "NM_p'ݗt斟нiMl|/ׯ]ڷ\W}sЏnO@?1%GuDec2Eѳe'wFjW!DHp"R%6vmq-9mS@~V\a@"$wx[oI8zk/~Xih/K vB޾kUp^RStyvQGGmuAFFӚ~ἃ-ldSy6 wӣ r6]كL@|2ͫEcI"؉)&07e_ݤ=J'HW^R^/zi{WsWg8! NFv ϮNN>g69oT|]O]ɸK'߯an#i WT)ΤTBQښL2,2NS'ciCW}-\[)vt*6P^Plm0$iȩЄo;`6%b{h OMyn/<ۨ 'W?vJ^eγvA+_,A뛝5sz|N;ۚw]W\K0QZ*妹g./&|VnVo׉P FaF<{x,jޗTڑlIהŲ: Mf MN/zcl'<+ 24i& P#0 R:!X~&<,xvq x,"xJu dڙ7IFܵ-tk!ej2(#qaqX87 ]i;\,y$.j1"}O~pk ) zDkziR:7ib;^HyIK XŏecP>?^p6u<8"eeUOHՏR2 DO%M/nzGHoeQu)LSOA%K-V$w8˳4gQ>Yw4JڤLgu `?nCΦh[jTͱ zKl5@ 0sZ(5/Okm9.6m 6wC~Z˔V_9jJcʲL صDꯪ`Y7ͅlU̘f> j6uS57«+=^ݔh|]͠ -#R%%wwqł AG8}H QRF{wYu@@VeX3@sttVb-0$ɉԥ\ + LL b|9A2 -߼Q]+&H$lƊsU"4 L5ќ :40gS&iLpf13w`4hg^oZ< @TYVQUE]Yju;]HwڹZx5W&'m7r}4V;eT#@mU 21` hXωSYq|{Z`s;ݸ՚hŖi sA\$̢:!sSS<#&vQCtpT1`@Bao}3PVaPy{w1D<"G܅P`悷6PN;ih.%E㤜&sj5m,ӥDC3idQ`'($4Z_IʪvV\Λ)Y/G1bJ"1Wq{[5'-;rs^0ÍGVZYF(^ٌ2z\&ḿ4oMa KJ+XQ령 Js,$! LR{JeK I@U6U)@p 0` ZJ] ^)U X2l,lbcS0E@lzL481$b˃HV `rP7ق]9ՉE5lq̛B|qt{C@eٿwUULm97]QSCA0jqg^QQ c$.QSCwG:~ޮ8tGrtT5]E$3 ~oQT1$8cq2s|o`mx'}I/{;Y~dt]@ZyٱQbg&YfB(9+~[Es8T^:dPEf%X,x{\(Ya趬tp)];D,G%:;aQ~a׃΍g曛oǏObD-af 띡o=œ5 X8+Lf-=I=0̐5lBU/?&[_19s@PQql*^q58 kRaQF5 -|'NGgjIl?^y9s=#D-<22fCm_tz4Os(y0fd_L cǂB= VG`P?_;+?ԪLrg/%Q .-,3_ހvZm ++X7@0=y8;3eԵNCN:yZǨYn̢[(tf3cqǬoﴓidm6Kϲ8=Zu Bq,12+}D NE(ȓ#`KXnIg+s` .L8 uNɞފ ! 
QXDHLN4m:+ﰌTFA-82~_Pom5 j ':YO:MU .aTdm24D :#d|uA*zD%X Y9?:zؚ8$WWI20wXL=JŃg1MG-=snswwE{%.r.5ٶM˝ 朖6#uk{n.M9`sKPSS d &:tAǏKwNNA74o2Yn]rU7Ra{(seui}!?: !J#Cx 09yY LdDZ|a")bt[@VGʒ:OWiHf35;08nKxVeWwRvcRY\0 5hZity*‘q2:l F̓ iRs梳<`RN+xrgN񡤦&-5u3;viAKǚGPwtҥ`bo^5a`KbбHuT !{  9h3Q#5"ʭO^\udS-2Gût晳:-o"co^<ύe_@B~9w1Szjb1eۋXSh7[!x( \9Mmgkݖ-&pu:{U}VqnZŗXmAP+lPOlk[iRiÈQS%q<KyEݿHxBcp Z1ׁZ N_=,%W %0֐Ȝ fj1w"A_שiRY~]1b;S#[̸#'꺞ՌN?T̬ד:,}fApr\tPǓ 4IbK*?1I1=Z"cCBcSs2eT=6/MLO/ͭb9&$E)y%:Utڅ ѐ6Wa0}Ca[:U竲W޴{h}g[N jnm ,hgͯiE L9C:A:d)Nhi}tRs3yOĽMRñǞk$X('Lz%Ly`FƤ8¬UM_hc"Q]o3Hh!KaTʎ;Ζq2p}%?[3XW $,8?ˁ~4?+v]0h<<Ju]MNovl\ġD3-X|bC,Ir^p$82O:.ݬEx= D7Y1yލ%=, rȀ@Wmac=ԋ]|qϛm|]hb0bZ`|gP&0(A|:>3}K$htS&[8U/c3-tlBtr_w$-IyOW@ ~x}5WMΕۺ-?`ˮWxuJơU#:/۬zm:0?W.L?F֛,':]y:rl6QsjݾME/ \y5k#=LF 5w7,y]饔LjWYy ټ~tۼۛ}m^3ϯ B=ᕌ1F@N>C:<5k~9PzUR!#GpT9 ~նA~h.){yew]AB-eJcN|4L%"E̱MQ*CgĦ/^ =Q$\t=D(K-քy qJ6 ΆEc\/vS^^қ <:ݵDf-(+RM7(\ [&_Y$# (Qڀ1"BX9Kh㘳:I\ 81jأ 3L/B/zZEO/y6ɣ=j#zRT&qys%0V ɓwmIe~ 0wIn@pP(؋W=ROJ5h{ [aOtUׯAᭈa qK=rC0Ikrk3Ml 11/%HsB/M귰 }+7]*z'-8Չ /W.;ŝ}۸5DtYPgC xLdx/Jh19 NMK.а?}_'"uZ+"NpfohhE8 o P~od<(O?ԗQc<:F(v#ޱ#&ۃ<4"ګx BZ61oxVC~O/'OgYI;_븏Eu9?qnb2FؤQM[r<*Ƕ35(My-a^A, etqO/~Tyݚvk[$?k_V?g5~ǂc8螿V^(-3C@^@|F1J*P9+e$&δy+]\Kt#y7`}JKc&4=⿔Eo_YVL^ʜIxmx0SG&3MkGn^sˆg:gn7.0I]v+ ͇x2놹D<;zv'7לDvV;}P9wIC&+eRL=XST[5-((EMDBhlZL2˳7Sh%9zOv=נh#kҲYQhV"A0J3RdMr.7 lt.,3 cVI@tKԡ$TԚ !pXEI/62YS:cBF =Yo(g\}l}Qkx JO;d[!A}lPЅP5#HL }/z}_CN\1fFw\QD"*JQh ȶHeXJԊ'0I߶&Rh'$H)!'rH%KXRj88h7XhOho1޳ x5+fД#}e2EZ }Up!%5"75z-+|gsbPh08NJ?YeM(-=I8Gc qȺ{5:Akߣ6omy i/YDo}tiJ $j< a6y`N-M}= Ҫ6k|7|۹3W}ѻn49]Zys= jқo?oރlag_{ lk䠠 `mdiC1eȠh1B6TT-Z𱖓 D3W(jWzX_!:oBf$/7!vR[+(g!%$T\>@\@٤,CLVTh/rVMȵ쾾%rcDd$IɳFG)HElHP"a{J/',Wٖ&j'flX/=N@^Œ%Nöܗ-j$5D%p1b<>I3rt9|r/Fg$<ɼ{)h hzN17z7ˋI;A}wFPWvc۪z@ rvA~:q|N>L/Xt_LDE@,45N_=zukE"]ϛ(c-E߭sN^>߮ރ,sq2U7۩ͩYgFu9_>;vZ'6ugTzc bq O{4+rJm!Č%u}'&RSY: Mnrݓl'ZA Xuћ] z,+[Do:D &4Ο p_`-%G]# BdÖR*OlbI:H^ÜF Gwj/ԭ'Hu}TI'C+E x3<{ɧB'hUDh(M@I -/{o?Z/?{COW<]UAWP=s 
wy7N1-JJVMTD_d4E"c=qr~z5{YFp^/j9_P]^ӗ.6~/W%.m$N$1Gd)DAG[Xۋvmdٴ%)w/y_fԞŶfvoopEwGc`a߰>;6M@`Sc2Poql=L:4VNgUdi%đMhgp vxgZ۹6˳q=F^eQu}Nof_y7ޛUf7?3|櫯{ɟB66DKٰLB,39! "hQJWcSyDZMz0xAM7Jeh-qĦhcr=wAgqqzb _}ai>P$r NkEJ VQI FjNH%&ޡб`\P2L֣$Uk6\#z߳99+N|r-2+ЬoVz\*vÃݝ;euw6Ψ3@OODɛ, +c+@fk;YRZfbdPp1E uu+jzԫn>\b{,Ȟ+f#Gn̑Q-W0Imvdw}%9# fƗ(0- IΟE3.&-_E3y I{+GL`,IڤRJQfCglHB!*$n:$9oO6j[KK&#at-MDe! #[r2,~krD0ϜJγ0JO&"DAM.%lyPY@z#2]nh,R F'%R&.󆥤\'!1' AX6!,lW㐙 BN,Ծ^Q1ܑaOdSbpcl1)xj>E&xv??U6}1ȶeh|!yW6C{=38^1< yEnO'd#IV49lAh@Da$Eބ{ bynX[6CV@>)Ԫ]ܥ(2`xY@{ZKJSNP,JXK*%1/2&Hk,`7rG2wm^39s8+ Qõ@p賋O%\\~%8@W>F[쿧yOwIM袏QM .6*PG@HC2@DJ]њ-j{"fעeٳƄRC"}P=sBfaKPj^"&(;tsc"(WQt9`տ^oCPܢ.l= >JI6{XtWN['ʧg} G(cl`iB9~n)2Z=(Q@*꘲E+h(,J)IF5('l=|dn'eE5nwSSήY)mtO6M/Qd/ddK A J0! H#v"){.b a[IiC߼Cn|:ȒkqXv,Qj99.:5gfK @pJ*>($![A\8[j(< \Priwv` QDb(g {ӌ]} {_Vd%b1هYZ=mWl{l2u] &X+X)".HL.*8T_o7Ω< ,Q*Yݦ26+l s;| )HDח8{8 y(^7k` *NY;pIXV"cVLBfz[)U'aC%~l< ! Ġ}M&*Pr0pZV"1f`po<;RJ=YxcWh{vGOQey,lLdTOYD/R8yk.%)4Wa,27& |1T(K31 \J,D6U dL} MG yK#Y4CZMKvJxo]wP.xQhH YH$dH +׎QPC C.dV{bGqka6n|p5Yl?0.bpDяM%<]fYRhLL[&j\[kK,A WC*tc !DW1 j!Ik@(sEPr(\݊.AMy|GR|ȸ᫨/lT@%z)Q5g+̶q?7izW煚w\3f~?Rx@~z\gX>^y?Nw ŤGE dx"T>(\ VRge82H;UA8ƒVCЊ9îDLjcLg @/bQODQ5Ų= db! ef\HYBǾi  } DPT<ㅢ]ef $!e#p3^f96ꬩ+16-~дyqg旦4X?35a2=_oYмl@(mc͛i򴙍'^|]o^ŷv͋_^TI^h^YGSjVRϧQڬtPi_>Y8:^U0]:xixox%Kسi߿H]s"ԑh?ЌkK5T`! 
M--/5k)CYE̯$7](1kSf^VL-O&OY0ITu濧 'I{zX.MgMH]6goa zq!߯+B7M*Ͷ\/G@vһ7h*78vnv JGTvTRٌ{ gG4^DBҳ+ +hvw[pFfkG&VI1w#Ճ5Uƒ򫾾\쯫Lsr(+Y͏,3G`aW}4(V+実f:(~Dl ǿG%WiӚ^_>*-G_eYV|-wk;>'.3;igu7+U##*ΉAZ:Ӌ\'j\G̎*+X oU:U^@m:X75m.-6J%_oy8MUPagI*D$,Tm m1F —hF?v8T~R+Yx Xt.i&nUp0$ YDLźBHFܮ]j_FrֶV V0D`)+ΖCl^?h@%vuJʾT*0 SkY@G&5]u#9I~`cdc^ 5NGimխ{'HDN\7Z"Z#ZKMQ6 m4ib)pgQٕV1'k'IF 㯀pbQՐRu1A A9e'd~H+01M^GCtY}󈘯r>Shv|is|K f'uC|uVビ『i*Sx4) y= `РjmB!ƪ?cCQ"ƪ ef$!S*ZXdd U,Y/D\8{)/nn RQzčZnn4: R`(2 9Kr5LEkG0): KTb3_Lξtb3ޞΤsW+F uHuޖ(NxGi^B "&l>V}Rm$[T"jlGK\aezוG=RY2kF夛Җp%.vt~7)nbu^ԃ 7 ({XOǯG_]s#5iuzKfx?S7y;?ǟ=o{gdz5`wƲԳ;5+cū0ߠgnjyeef2_{Χjk@f4zc ީtTOאr>W Wf&9N_?& ({W zXoFu<,WXՊ|ލ%[Jzb9pl#[hjV7i\Op+'+٥wZ͌o6OdW?5 ryѝX⶷c ~Q^ʔ(NQ>q1OQvwBXtL{^Qo-SsNGĒP43=]k KI6yuM'%7/lUK z@;JJ5 Ä1x }LlӥƢjytF+U5 $IVxsҍ'|Œ *"f޸~(4e9l3SXB}!_=Uٲw@VVh-h{%bS6N/*|x˔[[iEJ Pނv4x( V ԥ#o^{.*fߚ֫Cw1*4l@BbҘZOFHS^hP"e*} .w{gAg{DJCY h DQ@%OR7IТBld5F!f8p|(<ևs`'YFǘN;-R`) 1DDcDR-HPFhS6 FL=f_ uoEpH.z!V 6S`SeU{a\U}#*Qaɷ4!〒p8quػ7xai:#OJX!zJ0׬B2"m":AbJp[a./oAzS]l|D-A":k1NJQ`V;)IA,s>25:K<`MbiȮ{5.^̸#nvMe/!J$j* a6y`^z3DijL.S4?lowa] o}?MW'B5 i~~th٣gt J}FGڀVF*yϹoTz?UюoSfAE-"[/oR&AB7!-hOa8;ނO. 
uR%"$CD`,¿^kJzy?I 4F&pbIJSwfwI"EkآE=gfof Ѿ [wmڴVs mY¡Hhrs3jO`09vDQ%Ժ@R䙹lOo6ulvoVi-yBr 'Bk46uJ49D BzL}vǺk8]aߑzvCscL4z ;P;I&VAW9g,|M W޴0zʃHkS]Xm5o'辺N0,}WL أ[0!ppf%9taz,VWAKշ~Wa,ޥOUiRrpkzzN4=N8K|͒7WbĪlX5o+32.(ov0Ί!"MчB[i@{]xM=-r4n$d7s"Vw݇t}jBaPN 9Њ 'n2I~d23$ l|6{5$'@Ps"]"`Jmzr7*~^fQ(pZUgW?\+ᦿFq^FT7E8=2kQ 75Y }yihͪR_fk,RtQ n0zCbOenJ]`4NoL//G0J5 Ԥ*2T GFhh}vx"ڨR9B 2`fW<.\Ʀht\ Ud=<8v|_y)su1fϵ\p)s!ʠ5zwj'ԣ'H}_FE[,=A#)RpWDyiS^}w=e ZeJRQ4/Y:{/ͷmgk>[rc2zKu /pJ"9mGUQIa 2jh əL#!a%IΑ2&1E*$>a*|SONɥdfoM)bW'R!ehu$u2bEDq9)QL]L~ƃQ~frv\JG ݝK&7ٸͨfR^l:{}RET៟/[΃PT`=@4W^`ê$+%!l6D@w \& | IfI8T{'8 \Htk,F <9-VI.}r@'C}ku+nH>\nW#[?[;`v]|Iwx'[EHEB8, Ӽ3{L%fd66+nh]F1e0R̓p݅J4}(Ϭ9^ԓLU\`'$m46:Φȩ2-3?ZG$.Ct =FNk?N,g{";K](qfuƵf+3ct tQX!xCW80-y0j|w5O}gW>+jNVAkZ Mi4%+'K)!9x0.8_'ư8ś7d'ʕOBP:%S3WNco~t#- ۷?pZV*/:aXC14GYiZMY}~3 Z7$¤f^&К;+B7Zƺ nP܌buṂ06RԶ0  \]"u68cEM)onJ-@"TTYB=UsOUen0~+&Ĉ']WvYdNɷm{h| s ZW8m/ƛ^bϓcq+d/\V +D!|QU?xc~2PO'Zc£Uǜ;7NJCПgRwI-̤'r#v6F1wW77[P! \AqT{60*MugYg8mg-kdV-i8*姦[mݐ. 5V;ޗ8)x*4#g5To+02Q, `=}>0n,)+$[k^&tp\RK*դRS ==|vv>An'a۴;_h%: Xe Xi`}Z|s*SM{Sy*))wx4p]j%e%LJ}, ʲ˖x\hʶi,BgoVP۶m.oFn+'Z2ÜL '_ͼZ)*,k@l=\֨VU[[s3Q#c9> |1R8^j՛%/.)]>jssFC>T7+|rmnM_0U:TLe)ڻݲ(+^d__!m{rl_\}T*]n# h\`]N.&rQlg2P0Q .AEo3 >^I^8;9>Dl .K +Ԏ;0:9[%n h_GͷVKi)v$(%:h†Cc'}@=^{nzvEڏZ,ݽsԚ(pIgj7˭‹Ғb2xd_d EQM&\eKlۨu= k IN%Ou1dDh8`>y'" <䍦Yd{D*3+]w}?omwA?vqWWg2{ AO+&/rGF;?5y) cշwG }ns,%(rQ6IޔPN2mu WGtϱ~$,gt1ZUqEQO-cZm3=H3oS:>?_5ZD߸ i!|MxMEKVX.+r8{RZ!~u(IHfau{\%J{I`3TZUqb*t9kcoۍ?8k"^uN 1၆B]Ϫс9䎴:r¿lȽu~ u5p-tL_!=Hkjnmi|}8l5m l|뫇s93#\M~`Uv}7>}ԏ+lg㯇Dc:tj>>7JGg? Osnɖw*oQvLdߪ=)?Jc;g! 
zɁ\E s\hBrR.."d+O`_>}y|Naյw֡K}; 5 bn̓ĠFPz,kc̙H\r.Dm,A#F(1,UJYhxKhɀq H$i2V:YKh嵄ϹBg}4'#i W m,e6*OId| ,E&SGqP=;UϠy4c`mHI؈A1/ Se)b``sA`)SGaRI2 )O9>{gm"uTܘ`!'$c 6&t)w8aW{ ;1!CDն0㉬j("Aϧ /\ dX J/5h|r`ɴ`=jjjtRyknI4H9 h3b?]HN0a:N>yz(c:zs3oI{/HC7QDrN~Oލt3/N]W#qwIBmX}hNyQgBYg$8y`{Yl{?Jc.F?_ bVi#X〓l=I6^%6U%i׃uM# .n慎f.zeCE\Op#I24ÇPTntTޓ.ߵ@.YhIh [~V.#pLW-E՚oaB1-a6ŮoFkuzw,QY] `}ł_NYb_SH{Kx>}VK%Qbg*ȟ)St~nf9+>m*Du~6Fy$]c5Ňf%+N:޻/s&67w,,~xcVÝ@2ѕs_I p=߮v{|5IZeSJvrx LFcAhѱ*;2Oh=ŋy#Q1xvNz$$rO;UD2F`x2IQ<. s)KcP,/-p%%xkb DYer:HQ9OpW(tģxlئ#G*t Z5yknА}tgJ $J: a6D&l*Zi~J(0w5HEFy%jiQBbިI|(̅RV-K}A_sAl%CVڲv.7>"JPpL "HC/"=Z@%PF!cQF9Fźpey4#Ц"GёS O󅊠DX0 ! N|nM,j<7)kwt $Q4VY,(sI.2H" $ HpB/{< E{ǂh}0jlP*kGOV,c[1&\L /@R3V ѩV[^'YNo]7i .fx?cZSZ+ݴ![ngIfX$#&`*0$YirVz쾢iZԎԳC>^mͯZ\g֬كo7 B*X8\@fUB#Eҍ7Q7`2g -7{9sQ>_!`%=2(DA )$QdKfX25,VBgY 0C4Й }md:0\eX#~CCFz#_SvZڇmhqGУBl1x7樞AO:dzS:j"8LW>'$D(@KhU%88Nsr2=%T@KƧOg66'$ʁNeFa ފ9f˼+as?sslMZ姱,S׷,ݷcg v=OMU-0m JI5J;3* iPB J'Rߠk8%u06[aepҸdKj[RN~-nR”2-vz︔bt&sȑc&ץpee8Z# ʽ*GALur9hŘNh#S\3+JHR|SvNlH R$pM@wK*[QVCPPZ2h7PYPsD ya94/%hp~׫BH[']ذ#$p&\-zw&;3kq۴EV>QdBC- (r>HG#0Y^"Ռښ\b= i1 IL ͹􌍚(i3)ښ95c=RMVSu Tn(*϶)xG@Mwu[-d48~/\c'](u9 RJ,FMQ+RA]OIWZJhҦu+tdg#Fٌtg bY~O1mp&R0vIH) T'^mjbKY0RBb6(oYm&ne\)&C,Xhb֜-ݚtljMVtVC􇨁C`D;',/ lo+)uْ].z9vl"i2!CEMdkbb%jN* }:@Ug­5n{ؐk1b[M-m""v6W2r-Y/D&E&'Jwc2 f^ QA V-ΘqJr2Z%r3>*T&Z$*ڎo9kI'jOvqTOk:[ d-Ebgo cPi%4cVed!IIL5*WhlT۰b[M!o K4n'譨4.Kww\\]v2휿#!GJpll9Џ<+̈́56NeQ]fs]D(=L^H|eqZm趚s7]rjܰ,mH QoasqNʺPRš>6Œ' i2gX1^}"ʍ@ ?t$()DUpj\(?d:DpRՊ־ uƤz얀s@;d(sSQB Aa0Z;1kb.FET7偻Lx&i6q~oArYڰv][nrGgAt&qC圱d,@䘸I3EdtI y1 8{`i3EwMQgdVPePxK]JNUф<]쯺V}sZP ="jv9ڥ]}󯢾Y zZ>a[褏9B_)Ȭ"%^zH/mT1 Ag%Vw^|fMSMjxV(L(b YrӲi%g!9ILNz@ɬԡUg!)ł2)h2cxȜgbyF)5 U\Me" .,90fotd)G*&dNlX .r!SVSm( 1?Oi44qk`3o 9$6(;쉌˒)D <+_D"$`v՟ݼ5ٵH9ghnٞ6hcNi֊'Ϊ/Fr¡N 7h5L-Ak^,H%J&:":l\BAC`܅acmRӉtǬRk5t"Z{sYqJhg= k,c֚s7nvچ Vv֎{Azxu!֖R߼v牠. 
Z[>q0}&Zj>_Qm@PB|u8}91wPNiƁa3',[ c\1-RK=ma$q&u(H^HFۨδ<6:QYHZࣃO1-/q[x-S.hP QrQlr֚+V4]Yfl5G-$2%{r.[&F{Xvg% -[A%x嘨L@5VmWl9FY2!bF22i, c( I(q{RhAYuFkq&l%Ϫj踪ӧ SY:TϮiu~A0U_/q0<Q9+Τ?~zUVӔ+^^ҳreڳ=W=YiɳgճwҧiFEU$p/gxN/W=x2g4`27}Hϟ?IRC:TCь& X5iIS\,;Xp]8M.UkQSGpHH6fdNI5OBiqx5hZ 0U; 8 cpZ5_aPřo?Œ< Vr;=w@hjA…0n[D7Cl?O nTzިW8q+e9LJF̟~.;ҩ'צv̅dz-B A憃sI}GKK 17#ͻ9eHAP UOg’CXN! Oi΁QĵC /bY؊֟@?1TS0mSf1\"tmAu Yv6-n,{ۭ{8-8͠U&"w%JIH0{TC.˝p {9s 2i3KK{הmqKElf*Ѥig& ,#"$YL* 2~ТE_ ,"~}* s lއLo1"6#f'RĬ#蝾Svծ|]K˒A"ɬJ~pT-M/-S*ɳ:D|"NvHv<ؖopMV9@;H|Dc_U(~2p(-TJ1]X{F΍طoc?kaѕE)Cb=l?ؒ!,m5D:N`솉{D y 1GOıAii-qr[h9kGC t=جӏck~s '}?^'r߿sqQȮl8iz :p?-H2Do?&:0Mx|5X)h@N 0KYM=!7ZeRA՗zE@ǨA*7>b:؜WTe# Xr:c  m}<9I,& ޟBG"yj%h<߫߇Vf4ڛ&m07\*!"H.R"H.R"H.R"H.R"H."H.Rӵ.R }=}q᪋"H.R"H.RCGtdűo`lV8io;``ƨ_W5nu#n탭z.?=6 &v!3"DwQD:3ׄ#(Ǹ92ǕJK>D\ڛ|fq8õh]>:U VWiP]}eͼ5!"m?M>US?'`liwd 3 KqU^҅Cbפl9$;4W|V4\Re6&öX;DPϖ0֡Zt0vn;n~ ~b6Tiz1OTCɂ( aY,90uR pF A&13wov>la`Y}nk)_&сR6'LL12f'Ё V82%1DBlCo$_8mߧ`wҖM0ODaZaڧ0I&J#k].Ö"N$kUD%J- E׾-}jە ]DBBIȬ>p ~Fd5MZ/g46(j:=ChqMWrr6?^B j+:L^QJ,>RM:Eϱ~Bc&gZ%@p~j\"탦Oc=4COwUd'|37HV1 Ȥ[XI=9XQ {iܴN;ʦQt(hRE `Дߺ|0-.;kqç qUžbu9hP#CS{r;Zݬ6Edzuńkyxl/H<^p{+KnUV˩7+]^kMNfcݦx=rkvOMRG$$r_G8y~]/F_hȤ1CkvfU[Oޛf3@fބ xc5n ϣBnWF/Klܵ;Ysm݌y_ pۦfQ /֦'snmq/4v8\MwY']x1͢uAbS$3S J6\dctsdH2x bݕ|;n8' Of͸/r^ʨ W dnãh DA4cۄ|>&cw^D#/-+^1bP!E T{hnLPGiRb6rl`(3Xz*`{p@'QHkP- -x"<>t ݁#o•* G#HI,58,|\ن q"rvG;O7g<^lU=4Yp1gD͂'DX_ rgj^ LI%G :VqPc " x:Nq$l/o8%!m{0oRVߍfSh6},hRF bo\ *8*uhfk`DC?_.[vUfb2OH7\h=^*_QoYd|ɂW1,t_}H{+" ( JV(̟;C Kʩ;R x134_۷eI=c\ ?&Ru_%eZcZ֓mT,QE%g5œydžMDtU&FM6-v ?Ak\F++Ex^/][1*X-d} u痭ёU_9w<&7eO,&dQ5d4R  R^uc9^gW@7[i%; ^W2.Y%q|ڣ%jRM̖Va\`FQAЂ%4LRA5ʟ(wغs}wqX?nEq|&ZͮG |U~L&EQ|VQ'ERQo4K1G FW y`QVCj\/٪h\}AC|rU1@l*H(.U?wqg_j,{=D2QB,mWpS6lӿ|ݽ@D㞴l|OxjK71,m'@pN\1$c ,צܩl8k:&->d'E1jeƓB~(6.Esnߘ5)3GsdtǘWC+"FMU *YxE(RқZZq><$gaOCd0qv܀(2F@/Yq|QqPҝ^D.ϥ=~nM'dk!;+̆SR$S :Kd8""5D(\١_g,𹎅ohcr\+%):Z{`"vL{S䀞NII؋Gw6֭ZA::}w,K Rи !kgOKfO5LLɔJ:O6YRa6wՁY[+QD2L (Jy{E$0^,W2 v)ڊ뽗,:oBF[Ԅ(Z9s!X#\RgFEgIumvENkmz@!V؃e5#\@IɣG)yh Qa+ ! 
dP1Obu^Ըk?XiRda%*Uet\*V)zg CȀ$VA4ASa(dܳ<ʋLCwWTP<b9*L' cVWc)Hpgг?znU=8 RbU ad\K31622cӥx RO/hiCU:8-n ޹:!enG{F-pOh2gny!w,8}䭷`^ŋDO3 F3B 毪Yz68Q&l2SHj1co^Ocߗ.NOoP,`=yܳŭh;ϵ]`_u:_L(Bq2Y(wG3GK\Â%?6Gg KӪ4 !T\g aZu }9j~咧wDz R@ .2pYf=tg0ڌ0_;(g\2q4aGgH|;f|G9)GyOgXZ"D(45N߼t:PˣyܜR3󇿎!Qqhя}&fDҾ^'';Yf>c&yhY(b)/⬃U`:{ Z`ƈ=!>wdBz2r&.}a:GJak%m{y^:Z`CIIDoe,!*JCs@jeI7O^_m߇轶xg7&wiϘCKm qsi2ۺ:pfDtrbE`頿(n*F!X{\,"]nGO3HzY 0,vgfaD$)i]EfͤXʂ):+;9"(6:sްl牊5S{$߫N6w.O'1dSa@>H@pC)3RTn4Y~JB@-s )̘sdH&11rH9HZRPj8&[P-qk6NJ"Y 0za\φEq$I^{OoIO.M=tzrN9wY1d5n\^Ix#fUJbM\j5K%^И"8^xIhh%N4ea-UHMц[eOAղ4ly'ǧm>'_zE=nkEy[EY>t`},jƞ5I94r-v{5o:"4|6MI$.4;AI4AS[ JHŚUK,9Q6GbmjI/cT 1*jZkȆ+-E+%VfJҔ+D|8k子=#nJd,<==|Ql87BŨGݠQiRj0X&jilF~Zu[rI$ 9Y}r%`8a͹C'}(-'= U.ad]v&廎NOޯ\gu~}1yw1;Zyx}uiP(SVy樼}Ej  L$HY.jZ%XJ@ z-<=mɰ4A9 ń kOYi RqXs#82 ] B`be֙tgiY_];INpĮQJ%,%; jՖʢrcXUj$+Ŧ&:fؓ^;%$I>do plhcrm&m՜툝96:vEm? Tcl"IX?t 51Q- =WIQZvcls>F^ߝHN!XUYIbM&wĹ.b#4WLVQjlõ(?߁q.8l|슈a0"Dm[*cHMYed#T5 ENm硈6P [ǙTRf[}*-BfLyÚ߾U't\sZ6JvE2q\\p^X[ɕ-[7;++jX|(-8<6:vCc~%o7Y_W9N9jg mkřq nq?>Q*/vJz߃r,Y&}wThqr^x:Ŗ((55w|eaF~G"8Aئ H Dʬ'4h >stItU%f%cӦ/BRAc7HHHL*Sj:Z<zTss\qǒAsr?%5rq =&9f嗟-`vz|qAbX|%_諤w:Ol+uNg3G {Z4 qषf%:9;xO89d!:J49S[޽|9.LVSsמ6zY0| A 24Ia\Rbф`%Oz`ʴK/<3``C琓ZgU߰eȅf8 JBQHs I%L K̐cj>Q5kiE 3} …9,1ÚGd);`qX"1Kiɻbj+lbآձX1HP@OR`y`QE#ͼ3-c:|.!![;Z 3$ym[NBпrp:Ѫ# B} %[gY _R.>۫I ;ȍ'^HEL$Lӟgl\l$7,;m`؅wZ?Z+UtpP!5q6LR`}`)$ˆ͸۬_oQhFc&t{mnyZi|r{I|< ͓B}R4w>huhE;;_w= >wJM6mLt)컟_JIJqb$ȱ[&ܾIfB|*7hU'僼v; {Ͽ?.Dߚw+S}*B@%O$Ar:qL!7쵇TJPjt1(}\>t~e[߹u}ߏ__vB}ӣ?|gxa*ҳ/ t__wO}_-6+zV4' m)ؤ\ 䱦\E(s{ |vZ}ξ{սeuԧ5붓m뉱ots[r|$^{{pr%3=C/h홞鸵=kV{tŜ|q^v>7⦀]%.3> ,8|Y>gq,Ŭ8|Y>gq,ls}ͅᶹ<+rY>gq,8|h=pSVg鬻?N޾{y:T Y)'ΓM)3T-Tar]-=ˤ8x;۩-Y xZ.lZH"d05T!BR|L>!,Pz3H7wVLVSm~nu.KګPOE4B1bzcc>E{-HJ@ɁK&c\ SPLȅmLBC#8Gp-qU?ea-UHMц[eOAղ]~XsG< Ӷɗtyעoy8+ϹYW,%xe.dbu<pM@u1f:%NPͨBPB/(%9$%X>`s9me.%pHQU.Ĩk~FUj!NծJ"X[V HSH]!hCQP sq'Q}FzcEl"@F(s5*MjTRm1sU-Z;b^C#@?AAT ڤA1[ɕ<8P5g{'k z2#Y?\N~~{,h}| J:~n1y;}Kr@v%}zAĂt+$HY.jZ%XJ@ C+?؉N\ƠBOͽw(-&T]5+g)HFaَ0,62vB7 킅-v[gٝ\/ms~wMn&9<<~{xkfĮQJ%,M ňw AԪ-EJx4DԬvC{RPJHa|&ob Mjo66эjvNGĜ 
j~Aj*1p6Av\,YUJ 51Q- ))Jn,ms`U!+3*:+iR)N8{%Q\1YE{Úעlƹ +"D\oC!!6 dq"*6没Tɢ-b9<#1>XY|*b8JlUY[VC5͘!Ús!/~Iqq5ל:q`\\mij,:hmbr%`)DbV)Al:ʊ0@u .>.]0>+)P|[n괾Zϑ[W6SAmg&HG SN/Vʉβ8dկ~Qey[j8ÍMͰxuvtqv~GfwNP1B22+ uq,p^F[I^WxM`Ǭ̊"te{g"eQE'-fogv~3;ø:w = I\f-DJ[\@()` I# )0i]0^ˇÚg_\ijJmXsS"t?\x@~ `nA,ALW *FuxuKyE"u2Uk#]2ZK޿"*S\d]-%ZұJk $=}" k:8 kȚ9 RBQl1xͬRAJ.+}$4:g3A&ٰA:b Meuo>wjΧӹsK4)c#H:sBPYX [K LY9bcJ`f gq@H'0^]g쌜xFM9ֳ@u_0XÔy-yݝЅz?|֢U DݴYuAv[zlWm sOSr<}G%|CTIkP{&澄og`bSw/Zi "^ hmRkZJT28 `4@f$3rSQnkE]"J4)停'PL6qA: ag")8#bOVyLNY9P^?! R2-EJՌhٌ |HYH$wWj @t_-l%Ϋs|%RXzKڡ՟oۦx2]~US*U*`B}YuUTqZAy?﫵_|V_/[^PEdX-d4/cULڥѼm/!ep遵jx;}oz;yKw)ػUo L ұjը(B:Xpp8~}KZuJduQ6=-9G'5$ӌLY՜ՠei"szKqf֏#el3~\DO1_V+~9/ԥ1?bu`Va.з}ʝT- (X$}_5ң|cqgOT)XZ dc S68U*i<͌,\|1Q;Z =}9fE˝&*slߴMsD@N=1^ Kws/ji_'u|, =A||D쒲%@؛ٌ}foD!UϿ(U3<%eռ\Sؽ9+3-Jw7rD}\# hV1ˇ4ox 9dr:T[3Zh4GݷT`Ist20Wů n'd[<=XHf붧Q;\ wV-m.:yp _<&g .}e JK m" !(`.WJRꙛ]RJz# &3S6^p5F҂x!1!!jfr}Ңq(}r lZ&]rULGd^"#妸1lX,&{J`|pA:,*1AɘeIQ\i(,I5IPR&HY'I,GZ~c/c$cdY]oŕ m=nޑLRRuH]iARK2Jr~DaB)sLJZ#y{t@Ivg/7Cyht,Q ̓C //#JaTFAszUi D k"dsC _bp.D2 `&>wEnׂ?0]GjjO̹k|"d]YVx۵ǁR=:z.t$ G Fj8DG?qTqgk2ƭXF3*-і j(Oә>}OȊk`eHX#86}_INk!rfomw0p& 5Q PX <8Z&$1)J0xY22 `'L٨xb4j#Ө0F;﫟 u{ϳjZk\(RFY...1:s,(YG[rz'<*c+Yvimu9nUڂ?漄foNˎ2{s dϽ~О=)@@=LVa\RF-+-T!6m|6\BGX6x>bwx凳 l`~ ˖P jq V[12O~55ẾW d >Gc]\t=20V|UCpG-rP1yeL;@\ Is%HV:Z ӳ8qVH:d,cr޸~(HoQz Bv y4%NtO\?{z+w~<%J-oӢʣ?CO'4=<,lK5<5o޽Mk'gCR{0=2.Z\E +0)|Qࢷ >{%9Cvrw.#zMg2acxvPEJ8_x\6v~q5c%gW/lޔn oT Paʶ;J_?n t_`*Z0uJ΅d:k) Ph %tJ~}Z(RJ-_E(V!$˸UV8bJf9qi-3Q:mP )׎ww?uIL@I%, . 1ǹD (HI iv?+rƓe =q $W"2Dh-1#r*Pڦ[}ywGE)~T5SΟF\VdNgtTF7:G]BIk5Wt/In}{^ΎhaЄ!&2A=2q|.g*$R*T<^.81@` \Ƴm1Ad6[Dc(wc9댜嬕|b2>Fˇ Z!8C:&lLY < $I z-R/]/ 3bY FIo3"bLb,CHMRP3gCbIϋ.jdRr,p‘@{ cqQz\YaF(_8Y8pg<^A.j:IR̂}@y'FiJBt]Q(`ԤvD KmgP3Wl· d)cRrT%!K,SN(ԢT))jxt'+0VP3W:\M˦(G\QmR/| 8Z Qo }38g>*dG[ez19u: !ew,Wi_6B'G[eGIJ6JtJ,.:NA@\%xi0B|$|D>; enw. 
"71J(-TaAz*:kJ9<$*`T}<]^)qh|EڅC%6KHX e椏15 -vgga+r)XԁF:T6ٓ"@ td@lU9; RϞ_P„%J !Zu\4謍.gVɻR;{Yvj3=BPTT5eg8nq2|Y58u3<Mmխeo&g]kP}ÿ;Rou@P:PNw}Yqk1;n(AL8WۏáA$Yt{UD;2ĶDzYzBX.4ᆟTrf쾜⬚\E0]}mdzҨ,@Gm/go'; &OnHFk9+9KUbS685ZӉ4xF׃fF:|#W/-_f% j.$##TwyP:o%g >}׉OijQeҏ L}.-ڇji?e٨)C7-N9ish.m9K?g?{WƱ?%o3R߇`}/A,CSM4I)o̐6eG$٬ꪹc5&'K2n\1f0wW# Upj jҾlL1xd3'z0lTP'dY O<.a\Ʀht\ Ud=<|^xę)su1fϵ\pxC9J` ŠePG`at;;"Ί'qS߹Eu[;ݎ17T D֠?iu5yAOYP_L%T3W)JW{JdmoCT? L[f Ԓ[hdX3,~J"ᬠ9[wQI+Xjh əL<NβV&g3MLiyW+J * @tna6/<Icl2G> qE Щ\DC=ԃ{US@RztٴGpf@^?A>Rהd2?ǁԻe(TǣxLI5VKTG:" (IӀ]n15d)#:ݣoFuhlr]Ʈ1 \/RVףt+෮oBoVໞ֕*Fzi| Vn U5^n> -G*h]Auմ^=LF5ͯ꾹mty`Xoҝ]5 ՜鶁+gCn{3S`#8kl.-gvYmOR/^{?SMo>\ |"i.(VyIVKBdmJSˇV  A;#L2Ijz)u'*Ev;PpG!9^!刄hٟZ"tW?a;<[VQ08rJ ,ü3k-%f䄥d`66OMf籃!ϧ<&0ff& g0SPf< i&"Ϟ0T9(C2ȃt݅ٝhGFg?Iv&w*J \`'$m46:AMLqgSRmvج?\4zZ&EJ[v}YyBK%@'pڏM'P șih? gf2s|4Y.M(s.!ZqAIcskaܤNEy%7us{z}"+jNVWAkZ Me4 ORx o~I`:jV^x3ӥ_1Z>e/\8k_}|ުUgmI!EG{d_'a$z?MFp%lnzrƯ[{xp49;~wo5)}u^dz!} &Yc&~&Ӆ*e5/&ҾO.)9bT#n ̉|I/vOHڗu [?SkPwp4fκ&p `p]q?k k^8waFHrv alAנ(0Xr6ީ1 FL& U7׀8XyM09nx4m: z_a`RVcV?0B!a3:&RW wvT"WN Ã4t6b 0搼[b'e.A { o̓|"ecbyۭ `<_anp"<Wb*ϓcq+d/\V K+=Ϫ*IVL7v*IvX3N:ֹ[ЋU:6 /vI  xxIM)szi+nPƃ˳pZE%~ '6ISiY%rRIVRYmq'S~١euzGYg*';>)ͰaF+M9Haj;=UsiVkӏ:>O > _g׌.kUf`D )8pEʄaVy)`450Ѳn,Hz _K2,)ɱ@C&f&&S,15sP1DYdTFH @O &cE01R UqbXXL3vBY E{X[.Nh.nu}z= toTn01Og8b'ev 3aьxS $b2)n8R-l*,10p{&W:(TQg툱 U9)T #v1q6# 㒉y(]L;vEmUU]*Di DjG灋y)0@A1M 9[&$zrelJ;y RV4@$J&( Q1H@\a<,&f4Ìo0U.^zz,GZxJTtZ,nHUFPeTHZ:"'"<8" R3  5I)bhsDkX.FJY ꊶh!Z{UX=rz1n;qiU c< LV_kXJ$XA3F쳮jC jV-&gMԖ풽eCv8Rk1&ǫ.T fU83?`jWheS$@\mϞ'=|{ԭQמQ΅RcvөLLgZ`_)HŜRz`{1kNm2rK_]ZSeD.4<8AxgˤbbjXq!1I[ .c,`҈.Ls9c6P2 {$6*;It1y%76e1& `<un`0Iۺ#"$j<\o}~;v^كn ny7=a0/0edZM 4C G 7d5M-A,`" n$6`kn :2=t@%E0 Kd<9N98zspH3J0xLI8G4A O/c$?NՊHi ϭ7:PJn-_ƴAa߾PkT,l;8G4*kJ1]tPr FCEĂۧO 5KB铔i=E2gKijE:Xy^rydx-_c>d*ܒNGG nO;BaO?q|S.;zMz=4$e2{z|IWcS%H aɫ)dˆU@ArbibB> 6LLe[Pj8moC܂QMFG>TZҡcX@ҊsZ#H2' r=v€ N&ǒ15qlC?Q|?:`mWIr$~Lig#AlV%!imQJEUO/'.,>Ycu(j%Yժ [?2=exy}4`z^]-m^W?d< 9n_tv<]Zfay*gg?jl@ .}b ߞ)t5L@X EGՔڕoTi[9G߽o}~4m0{}Y7=Pq`E?V,y7Գm/hdձefm 
-2MmtÆs+!3A*/vKHu5 pq(|GϴFI: bfv7l^Dq+lX]ռL&ͩ\X+x^ Dv]j77fb Xn5io(Q3ofeW1}hP$T%FyI*4׃`.q1^2f>>@܌xrvMNtP*G}}ޛ.jfTN\|t4ZDzP.tJ/AI4j0Jhkcs'x:gU%FK]+ $l]q_fø1g}<)hd ~qXԔ({ApLwx)C:EXuvX4xl/~\%ɡ;!;6xX0|-ތ^{)ոssm.q. p 0FE:rJ|!X$I+M >K]% _G϶G]YZ0qrUgRa`VUٯ<"*@d+Q"$-ZG;WYǠyc{x$-I-jRqc-h$s(pH\iLYMҩEKhtGLR^B7ݘ}b\;n 83R2d!6$J^j(|HZ#ٿ20އd `Ǝ1>Ч5E2<(FVEJ56m$gzz껳]B;[]Cp122{5`xٯϬ?<^/a"Yœt [kon֗i&ś͔ϧ Lk֥O X 2 Ц]6t_6zS8]"rj:8v}z{QMQj:x܂w^\=qֿ0_!4E썧g#___~'GKxGpk[KIuOpe.&O}"4n/St/9,ٛɛ? V)-NzzTm<-5T`}Bw3=цxM7BCCp:ú'1ø)2ś1N28H ӝѵs3av6gY("nć2 ܘe㘥/pոLP@aYSB@PwҞu"Rn mVFm$y@Ŕ$2/KQwz([uȌYfEgF#r"-mg:yW}B(+Q|ϢYH5smpf1 `^F! XQ15z$*qqC)s贅,|"zd( ~cL_Ǔ'/1+f<F_fL/H 3ALvH-W[wK#`2|zj U>uʆ *'h3h0ЬD**CseQŝq[N˱“r)fqpv|Y&[nvs9Ƞ$t.eGÖh ët&TYd8a<>8g{$:5~giQ qMC"I윋(G YJ7% t,߫S1;ÍNż\uS1Wk{ܭfz}Ŧ7bJ)'?鐙2jYt;=rq]ӕ1:gO[-Zp/Ǔ!紏hd>=ʦy4z@3ĊZOb݇-Vt;>& Pc,c&wțQr,ΥY`‹ټ~f})#,tSb7P`'[#g % X 'L)$e)c. HZ鄒& @i6(҄H`PvybX uCYKmF|D|Btr)zjztZobo$T6[w]b y$qJ@=P vqF N-ÁKIJQ%^(3ݻ(@gdž -1DҲh i [|*jB2ejJF9`RA`HdDMjD2bHETB@Ѵc: -鬠?2]7 jr UwjuTPu0AD0F$%(h x {e0 tI| ?GR6 B-9lY#8FD2q0TrP:S|Ovҗ{g8"z#+L@$ pHa X 4Jar؁qt8jG3iGrG^*6b3%y17AI NHcI@Je"49P[wm1 y u"|mXL-SsƨU;E!3Α 82ZDGK{ϵuWpO;Gw(c&@aȮԡ;-:AyWñ3@h$'} BH6u@! (QgeB՟pйe(HʜHrMΉ;P@|z6N!1T'X]N 3mҾBϑq\EŐAi- #|H5ENiahdj:Rb&y$L\RFe)A@ ȬȳҮٲXwMJ7=>;ް,Wz4 :wr5X0&h[5|=īt$)^OY -\ yUbE7eIjב HitL*"YʂSSNJ B' Lܞ~ݕ = Pc]}0:"x0-X` 0LEB`Zf9˽C3ic"?.Ѵ*i(ȆhZim(A Ɯ @Hҙ,(BꖐvYN`u)l@ Z):6f.}\rϏpO\BV ,@;RKʲ܅OЏOwf),e0lp6JNdR9V!e|zd.t? ?<|q~|oTu1׼xQfF>{.*di8%p"h\z.;M(JƶDYQMS")^7+zJ2LSY>_u$'J[gaia{gߖ*4F&ƅٻYp},#ֽ"v"ɍ6KiVVDžF"қޙfl.7L腡Ϋ{(8 kHփ+_Nin(h^'Myc*|Ң'AS篘6~7|8KE |nФU: ֿ6UZ{0uJJ Pbuن0쬬ra&3im2=?YԀΝA%٪jrAqDa - lJ8-R!Rm3- .*PΤԓϭ{]#x ">Z* ,[k|$;ϝ XܩS*3J"vq :_A=g;!W& J'`$ Tl>~%YI|Ϋl0*ӻM۩b:ЇeO_>EU5E-{0.K#OQYE=@󀃵8}jao90elsV#"[o@?;φe s{6"![߸C,݉GEu~ o{VP phIQ^ro5dSE"N#8u*2P& #c4?Tlb^,wcݨ5 TsB7<,~jeCE0. 
tVڼ,bVrAxvi˵y7z?mpx':ߌv來Mc{UZVr ӎ+R7ZrnQZu]4.5-EzBSXi-a_q63#oǿ7(1 ehGXGz׽b1_UB̽ Qk< PШR^[Ƣc\g8]%@t <3QKY {Ndmd*eK(ڸYkx++^+ V>+ Oij)9 L K[CQ.11DkZNAa9 lʰLqY̙C:<Ncʠv3R"* iFh>=8f +XǭVD&4KӸ% yō^Cqvb%8¿_UPxaBrKYo[ ]7HȿW]PۖfZjJß; cEFmodr/2wVmyCrjw.^P6Lg5Yu[ɥ ,qX!Jcaϭ:!+- qgh2P^uĊa;87"td sUz0-p?nXtuh-SuJ[Iؕi"-rRU X( ǞX,f";P epKb+\V cWط4 f {Mryߪ&)|ATlkTfYmRaP`VX a bsx+i+o&X0E^[a q"BkpiyJ"a0n? M90HrqVkeZaD]B+C0!$J;X xᵌFMFS,*5ewEd7XR`\s-BF0Fyy14)$<`fBQkɔ n|n   eB#wBPHts̡tt$"]`4dfz ~ÐT[K`6R)`TaP6C).BUqj|JF4L BLZjXX[fi#Ӿ0a>)$帥 }%0km-pIm /{Ƒe OXUv`.ac``D)&Eʲ(QRے|PGv}S] 9jĬH)sqIQU-wTM%s+)Atz9)!K&4}d S]RwLMZ%GYGW( c"_'LH/"6ZZ壝)TWxNIbtէ (хk§tp}E DnJd>{/.՝ @ #+ %D: x{jP]ju;"4#Hh:*c5.d)5>y _V 5ȤuՅvTlq:cHp+KP-WwV4<V Dw*<}*oma/髸 d}E XVGK 0"(Qw@ҧ_}R{IyH! ԗMp /s#TϺ^Ctq3 hcJI%h vօB;V` (-!\6BbFP,f,pZ{6/ DhQ8fƮncQm\g$0SZ(ͮJPHĴ ƁGm- ¤0,,H1dlDtj)#(\i2>Zl_vVH'Y g#Lk4ZI0xnAmJMKKު9Exw h ߤtV Hi/^k:hIUT&UT̆˻+FXq1-5m눙$K˫FPClJ''40z63 ]{KSp$J(uܵmT=kM!JK9O%'Ѯ `L Ly}D}OwφsR{R@k8nJxKT]ж+9P.Oڪ-Hד%\I"uAAvPXfR 4Č*m )w!1ւM[ [ƮXd jV"⤩Mk<)Aj$r,JxyŴ ap`R'J-*Fj3v;:T;]CTXd أt%h#*`hAgp `NmSƚQDnf7R'kҪTg B3e2>pR 9'kGi:?T赫->"A[Ҍl Y[0m@ncX©:lti =!W&:ZiDF9r'KD9)T5]A=u@m(6 qLtWA$\0 wmM곩\4rUf\. LDC1 4&IGR!0. Z&J#:ouhx蝩yd]pBk P GV/ü/{qbqzA# :@4.+[>~z+`[unh\6eӺZ.=ͦTw=( gOI ikQ\瞌@2*V=G%Mb%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V=_%*FOI $Qd@jVJՄ@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X JBš @08ȧ<%wJGαb%3R2:J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%U!Y'u&a[t@D@QE/^ +/cJγ@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X |@_oջًsjJw=no7ׯ7V}8^6W)ѯ\HsN>-Fl5/xA>3e՜,gi6o\_Gb.U9,_+Z8,g' 1NO-_ݗSAϭ.R 6,? 
>rxx|[ҵ7E$M] o/bVuޚ#庻z/ު.$_׫z/[n(hzBnj`11կ??y9;Y>]N;vr@@:% Rۗyf;-\m8j3:iXl6}޿׍oƓ?!³ڧ|Onq|Zwwyc;(Mrhˆf`CNZ ^Zmϴ#^i,-\z ʗb} QF(cݳm: ;R#4^e=oۮ(Jx*}9P Ey=񏘢P^`Iw9wP[XQgޡ;0?pyBV,Y~ԨvG8o"ɓ(Q-V_=tByܳ~~v߯nCWijus!j•a)!-rr/ Q]u7l"M-sX_vlW+EO.o#`yq~V;XlY|'zvw/hx/zl޻\Vo:ULGDN޷f YIykaeslpl9˺;*{ѦC/vYS*ts*g_z66o_XdjxGgo_tO:K)aͭ}qdaPSP-O.WU ͸G߾O[7_trptRyp(n.n}""1d6;3zWd])TUU/0:8eeuE9E0AF N9 WJLU6m!i6MHX9iM!0&3g8Ͽ@xK@Ȑ3sN[jK#  ZuMY So_'=6B0)tHDVdξ e4Ee.8D CɠV9=zL/'=,~[~|X,Oqp;W҅WRWr V85|uprwՕT'0F9K5Tuj&HjRڞ覻|OiK-N$e;=;gF\7Nɧy:9檕g]~>giˮoN^׻SAaւ걕6UM*4'QcוOYH(CNU |mvv|,75:5TPZqn F׋ԛjǮV tcCY~,15:"ْ kIGp~aȰIH ch eҍf:,eR[wK-6h|ɗ.f 54jو! qP>wYjmՓ7Lb7܀%l-b%J2K#ţKh̊Tl$A[w1v1F-J.7 iQDSs3*2>% y~vLWLg(:Z(qF0"mJF9vRj;Xu`1UkUwz4(m<}m}BNq5퍢.@,:愄d6.Id[E%LNQ2?IۿWik=qacMλbLF'BplCJu=Fi.k~jk24?"K XсR%Ln! tam]"Ydɫ{/XsOYYSB^E{(9+*-']2|I{9[>,iglB.Fz{ٓǧՁEn}\Xiyڼ/:o ;YyT~:>XW)MAB6wo׷^i1K[[Kz׽F]f(:x^FQ]-(jvICG_T>҇ւǩ|={;Og'倯]oGW}H2RwM z m9l%US$EQ#'@,lLWU$d/s"^Ѱ^o@BaPNɈcQÉt7'Δ˓Էesm(|zw ;H`m*#*f^#S^ճ@l@S$r=~h[$3WHX{j o[Fi=e>jg ;T)55 lZpMޥ}Tֹ D*@0ۼu GNhJu'቞rቡ[)# ,#,C: a mK)bK.*|l?;/{^GӢ_}z%'ڣh|NOmN'~N=G9.Eq5!FBvCl㶯_/?#70 twCi GcM:OZ0x5c!\s!<+w&<`}sHKwc}T`P" Cd::IQ~JA{O`׾۾6 -byzc۽.Fh|w5՘גcow`v>l;40m){MoNQK&]rL*2Q.tp11!΃& ̌RGxZ8nT,F*CfJD#9reLcTH L*UNɥdfoM)bOR!ehuHw.VВ$nJȄ:&^WUlmkWVUI54CtЪ48MfeKMFj?B)r!xl"yhQRGIQ_^&hzZd0M61q۹K>Kv d|STo v+lwrԻ6a5['+pN7[*7Ttr$ys%Hg`]ΥUC`:9TkvO5n2jl50r:z8]rӬ'%,j]̋wO޸ yinz;]ݝ }h~9 >7 {;9}+'\1 ]CcmR#\#l,mOdisѺm΃F=ޜnayefJN$nL}v7qls?*+|c3hhu\"IK}HLeĂrRɨCЕ) 2i<*% i!HScrGm`*Sfs(j=PvcV@ln>zYOY٧ơ|P:-A(* m +UaQgbuA6YT"^'$ | IfI8T{'8 \HtkG <9-(VI.]NI+9%}'6KvSnuH>\WZ<[-;`+Bse'K<* GR. gI7睉آ;(4;',%gN#xEDVIMvX|$9ØOAi1$ME=aĩsP,E'yڑ<]#ݟEw"8}bs=$^E)ɵFiPyBFb dw6EN-Ֆioiqp?"uZl}G](j[8 -682p3hsig.ƥ3Ź` gf2s@hi\&Q2B'/>Bo{娏3,o1YS.{S^*_SuUɳ#8ފ_GF8E-fߟT~jlMg˂>)˫c?{wbznÝ__^7>omx8a~ ÇiOWgnrzwQ %OӷA}żY^}nxiMwM X+~Mn}u0'9脟&Yo{ۯ!E}_5Ɖ0O 7K6O΄K/H7Huj꟩q7/S<&;Gp׮пy  s$xYQ/… R8N Xy#ići`L Q\`pGkz0- bQS|XRԳk4;OYp屋sS"HE9an-ҝ-eѻs# =74>^'^ͯ|[\4A>i|1}"Bp؛O]UMj2@nQ55v$K_?D9̹3yߋ&uvۑEa~O7[cƮx6b#Xxa,ݕ+ׄT+/y 
mPrA)Qt^i(u+<;rs6ɒbRl F< 5a+RU] z (VUU޲%D2qWJR}JviB^6\LG5(~pDvTdIVVnSdVs*GfXR*u0?:T #L30S%.2sb7իB Jwx V>:\X>v QCVNj$ysTb@)ZU/JݕzOf=),)*97j[w Z0QzkH+:%U_q#*e 7R*J.YٻceR*,LQv? JDSDE8pBJ͔yux511[_(Ggk yZ__5} zJQJhWVapUTk.Jͅ$wl"_pwQTf2>/NXӕ}]SXn`]z>W0h6{~Ƭ` hvcQL>õqE 61>#zacXS%p*VE~>\$=cEpzVlx{1Sa(alfpx̬!6H?8ab) !a d0|VXV;c3:gSΡy$6Q4bh\CyA0F,0sk9Qg| 9 ɂcLjQ0b;fyU=0h]> =0X" ] ="aUvTjηҚjTծ>l/ָlDfpy.RiB9粅Z9ܢy`0A6e!ٳ91OUYDbD33UmT4ZvtWzԟvn#=:ԍnW`Є-0g>OF-,RɨэQ{zcYm6x> }W5 kwd7eW89 î%\5S2Lȗ+&\"A檋fJ| U+P+C;+gP&g+)7;LNK?kmԜm;h /^{ߝߚ FM4ic.RYI>X۽ 韟A_LJQũ;4pp|&nuHwM~t\Aa6ًч 'hG7=t~M%x4=Hy&dߘ2n/<~RsZׇgzs^=ކM{;wT8-Mrx+p'Z/)g_O`Hpkʫ%"WObb].3دybU`|/goQñoz~H_ mB7P##߬C+˿yWh榔Axr4OMc ~ЈNkX^ܵ /lYjq@0* R:ynH{w|lL)'g. -g'iW'vSY)rIvؿض`XV^di7dMaCvb?Y;~qO01jx c&4zks2nR& N ѮNGaQY鏑lVuqĚetԜ4=F) μ5;뢘zZ;x<:wV{gxgv&Fd<k;J;zyK:CΜʒ}ӏӹ-s[" oNȀe>:(j :9Xӆc,x᥄)!#vo>KOmp;f2/>}ɛλ_ߞtNjqcye`2)IB1,Ak0@HsJ}OTtK#F Ooo];-ކrٺ-Z-<M?82p' & c&*`:&PBL`;󘎤ҁ#sg)y>XHsy2tRB*0U=6"+AyNQ\Ṋ3|b(T %$yv_Pw`3W-1klyc6wdz皕rr n䁠 UEJy*1Q i6AܵE;*thǻ;qr&H`>zZ||#IP@4 C1qn̥J("JC,2I^oCA/=<'Gp g ״,g[Hj"܇#P\!u}y\DAƌb[pИ ShocԄ)ߦC05u2X%tݬHQ`4+TJox:Cl@s@z$wFC{F$23 k![%<`I+0iiPÖNɛ GYQ$f6xY7WGKEP(>7%&H.NJgR !b%[@" j4?EX).(GT-6Z6#y|~ASV5g@ ]lN"J1Aܬ1oW ^tq!9&8{S]0`%h qIA"LAn1(R*UhͲBT ̟X@J8`a"9L(< rqO3"O)SxW|tL~nd u>ܖ?LT,{w5kzܺN>>&'ǖ| f0wx p2\9dUlO* [AHp՚Jӎg,֓V7R$ABGXLїđW.%ۘo]2Y2}; 4^ΰ\AǑ+|ѻ(xBե^úƫ6@%sB%[HslSXW3O|PHӀ0MqGYl\~:1ĐMۉ,JejMC%ViA۰H# Edx EW("tBAhvWG X֗k":V#b6sNBZQ~`U4kVOmgH4vKDÐyhAɿnL1Dy_6e h$*)4.xN9TiF?"Hq9ϨN"닜_]_7>V3U)z7fIwEr?[Jr0˒73 glRZB6Ѝk{|y|%O%I^G`6VbU|c/1V#Cd]W(Ƴ"6}bfDuH<q;~*Sdk+[xa5e;Z {SԼjwkXo5f$DfbLeTY3&׍-b\uA겜M>*ʷȨ[ YJ9`*8H"ɗJ"$H1X&"S)1b0/Z:xj*?Sv*RgsI+ s*KsXQ3)aa*JThr$X1\XX/U=8 $cYT$B+3ac&1}Gfxx^$A{cfCֆ,')1g lJ.$aK͖S1QAA ;qE^){Nܧ14B ɟiJ`;&q38 8b<\QOrfަjV x^/]yRѳmʞˣO)u9J hKۗGf(R*\Ԡˣ0_;25;$yOZ/,u jt(c^T5 QKTX7paQsUb#{\  SXq`6)aW~AbNH r<>"yvHՀ*/@BG;Nۧ>VחE>_ӷ<]tgˈ ec2~*Z0}|>~|wq^I7 w ]mꇼO1}̩M>G;xsZu7-н{{8EĘ8Q>f:aszbngBnmӋ]Fd[чu%TӢy5`_M;zTYggRI}V4~^+Z&XK.٬K `B$ n(Owju3NѹEUF (6Y@wʣzzi<&c 87|efWx 
%!+NgUZmWHZ0;&Yyla>U:O00"-M^BGmW|!+$3tI Ekso 6>!m*7H4WT4:sIf2i)"SXBD4'8ŚXI២H%rKUoN!V.SfH QIqBcH?JR32I$w8\SJ.-xo[ZVA}3W"TM qY23qxTBp! ӒfO2% b(IXr@l5 yۻ?ߵ诜h^4Z)PdS??,7"~Y|y>ME >/F+-+R$/>(|ѧ@ݻ{HS-mD;qHF/86z#KDd$_h/F$1zyǣ =ͧ3J7wed\X6Z,Ov1Ls%Bd֍_K 2, /FOo_6nh ˗ir~9Zjy-9J G#q5}c?Lg]堥O<2|::}1/8-WΧi\㲛7.-Q1Yʅ\|ojNg+%һW9?GR nv]^H[{L 3UKqņ]Y\*b7!nۉit`iDz3yXx7ܱ P@sFSABhg5=V9A,dEDФ\J b6;!Nϥ7uP-H9 POϱM@3Rq,"ЂmWj9EW/\Ρ(tRHy,^!(݇V2NDi_:*rC!ײr = b4ʷ =U5Ό >UqWH0]6Wx A;pҚ [K\rDž\,2͐*<I(i3M M5I`F)g"T4!3 _n. UC .ք~O[vGOatO}n 4#Fc4+r#`;%`e-C {p$d>"ITi}+{~^%[x ߗ2f=_Y~D9&h,fX %4 \PP8bBͻ@F;ZGZqgxzgEo>j> $Ͼ˰o2|V׾mgu0]UߞلEDMaJMQFu>,6=ʂ^o{J~<:8*LĄ'u:r5 o `_ ELI_fp8ldo8t$!֝>k:qϩS;!#?Jg8ľ4 Ǔ yX_*0/>D3t^&C-scD Hb6m8 jCԻM+7hQks N7^,j`4exтn&2_S6&(Y3'DLp"C0pUjNĂqmF iyp6cB* GUU!TJgX"a INrAh%kFЙ <(}MZX2R-|,3',r4%i Y&XL)G_I* ,2".wXIT]@QA8f/ijTy&y1^V?|L^ ŴXާ0/LQLL=d7O7F""N#£[ۈEw4,B)#_pm^G었H܍x:˳h/tf#&,1zyǣ)`:4.~Y"bF hT%"زxbnԮC~ ҮS[a:29]|&D{ȻMEuWa~ vqu\O#?hGK%ʈ hJнVBt]qP?m~m[bݴ:,Rw`[ڦ g^)tƈɀZ,TN!P `H rrzx'QQ3P~_ǫή}Ea^up`zud(G#g$|R_p O@)4Fu)R:&48|l+94~^+&XUJpGɭ~sVq-*3eL ,R@SZH`۶!pQ^q-L: EM^qb([^!9]YoF+^zXNdF;S mN U$%QMfVSs=U,חX)8<7Z)8,g5.'$(#$hG[,X GbZCv_!$XDZJW '#h[h I&e)B]nٟ-;v)ijMźJZOñՂCy0}GlnPpoᯣԨ[XT} clرc* 7+-6m U苢+iw~Cǻh>93 F~0F?6)^:iǽZ/ѯf8cqz>t١N>&_cew3T$l=[WyS\-6K$GOnZ?OԜQȞ$,tڬ|k p@/aA@ KqEGobO]ZODae6k=FPۅ:2Y=;x!О)jv`TQo @2 %j]Sb@о Y_MYf̈7 /'rbgxA!8~ gĂA0tcr4s܅ٮ7NB_ɠ4~ $+ē#ϺまCm(Fw=l\ޢ恗#Fh<.N]=neQ%Re/4Kn>a͞fPؔot!Y7}E2fr)޿]-,%_0{Q&μ]' ߊV5 u6:̓Xtj$qxMx& jvsaxMP?i?{u푅juePxdk2fWqwVLdtT iNZoH W i1KʰRT2yκRTu[er%E\uXfpX-?ݑ!?WOw>U2Sv7s(#9t%DjwLI+b; , bwdɼtR/HJS(#&E%uR UYJ0Dy5"P,(.3,I!%ퟖȐ`Z*A Rحn4dZ#9)||Pȕ2@QA)A0`zrIG%T0 :|E|0@UxGXB% \.}ąF@HH8K!&`p%"R5 |>=ބ/W+_Z+_uhy[ټHk9p{uDz#C1@ӏѷ?L )":!)P!n> %~ߍ|?D|妡7rxf^BjmA$|=D$cG$$ Vesf4`Dr/_9cO@(곭zУ*< ϡeҨL=+n秨# NU۸]);A) W5V۞pdo# +ʓg0w8#*5Lk fF w41at VЭ o쐰@_+12 "ufം}5Q4VM" KV'uxrQu@iIY+g/yh%V x-p# x4=am280޽ҥŻ_8'sW\)X)|x3\;`&7\\3`CD,D)䜶,J̓SƂ&B+2f͇ӆ),\nӵyڌ&*!"C~ j;!1H{Jmn(YHP59,Lf۾/cX5me_ܗ58q*`NnK-"XYKg=t6;TJeT+I^::XL}yKOUo̿Oj>bɵ 4sS)8@vLpR ՓR)Mj8iIù!C!US_G&q[T$HfD\BM]9 
%*5* M59qbqb(H-qi,A`3BFW QV>=Ix@1B*\@ [^9cWRlFIPWOg#DaF05;oVcҤJ/iepbf%EV)trP;~]kOUs~.6SOá/q=807ACý>R:lQOĪ8Uae_oGAi*ꆛEHtm6fD*VD|{Oܕt4CǻAC"IzFôџ~G;怈m^Ozj(L;Էnb|~'HɃVwOF#&8wɫIeFnF2< J]ܞM'֋(yNa<`}--5eK:'zG t_;w~כGUHu3(btF}Krtdܯ`ifX@T'T@4'aҧ? γ?{5fkx85{"涡nOiLS\i7܃&'2^wiVz}ݽ_eo66TB > ),/,*EjmyT+tB6e(+x Yn gGnd\-UC7H!5SfՓ;nGv"y$9\/}Glep1tG uNX?eOwC0섹EU~f^t$7HuP昧W Lus5#$dU\̻c.ͼUFnK83W+ƣn$nސFt;UԩmxC~זTݞ)in^2SrVK"*wyB/>P 1g7H^2G#Eƙb|`*ۋ+Bp%tgwqđnԈ8n̈́B nߺ6P#c=-!hlmj_=CVT\g7q8UGO=ƻ{;f8g6jw$ \KtB gS溞=z{GxF,’c2Q*ץ\QPz.>gtD ؎?w<:\}XҌBB+,L/}yR!ԉT?M桿בBnfڷl1u^$Sziv~a-C[d+'7iʷp GPޗ0\ C:݋oO-'MKHgl/Ԙ%(6kB ud2G)=Χy[Ou$=*dH,]ڗX>A>;c} T+Jݜɉft^qrLJAϤtF{y܊R2h)h:OE @K@}ý"#c6C˳.Rv8ksLtfZ*B} !z[Jߙ}0H,Nз_TOt4C g ѓ̍/3ԋb Dz^XjLvZ āgCT<0q?Em1  LL3#澊Dʎ5#i]J :5yrrhHDr㱀bW?B|UƐ0jCSu!KSL 8y1 ͸G yGGBT#}.l[H<K`G"@@%̅Mʎ%Q1;uIЏFꜻO]H%ӖO U_y B`{=O-+?bL8[zj(g}EUz_@e}O9j&|DKQO)GD3˲~g`y3絁'lFQl/@uf?s㛟^@AJ2t蹁.Ѫy\3Znde\+5,ixF Q(PkK:qޕ9).tA*O&\'blYM BXH+2[F@z+!&Z[rJ&8w&2_[ 5!cHn%Ќ91˪vå'mPDtV/*uV'`F9.u7^O>W|܄*~wc$8gQ?֏ ',X<30ÞnԓF2T9%xکSa<ԕb}"w<]'p vE$uuY(cn :dm$p}$uwAs} +]f='94~μ}Fxs<쳐 ژ {TIP7UeSuۧJxS?/3Mo/}A$0I̞Xj6I<=xA"c+# 0.s\E%n4";+m#I/A/a  aAdUeY( {<Y%e*Aӓ3S]k9 tLt="|H&KZ%qkuSg3 eMcF0T7e^4>t#3Ѵ-ˇ&Ͱ#BQ%t9DPm$u0gYV5#};P$TNbY`2@!+/{;e6y@n9;]F\$X+n'|\?"ts-}b>Ԧk Py]d-3 Mn5)=we¸lZfFh=IC#2n2 Fn]D0B}zOݼX$*f=ug0#B\6tM&Ğ2 0WHeA}җ(M 6( *" ʈP[-$JT͚P)APHOHD2 ǯ).u-<̩kA2;473W{X_i\e3+PW,Ⱦ`S˖:&Ѥ>[([MiBYSdY6dJIҥrH31nA}S,:M7)uVpv!6PB^^\\2yhQ~:e? 3 IF}kV7p,J槒(]YmX&]=,? 
pYa!w3??s g %ncT.t@0lDhw.Ɖ&s =s@@8jNߢ9F*@h2%Εk>,Huݬb~qvڻϟ0kޚc7HGOiXh`0bsBI?|9 /Fn)W'p9s[j'!{玺:2H񾦊XL0˜J"inz*g;[3:Bxr!1DHߗl ã,3詍|Ŭw=쎠_*^IYF+zZ 85&m\5|s) ߉rm0}n }rcf+qE%+W!h52RZef+r~$t5nIU 1*BίE"U9םJLK<9N'}=IqKYma CViIKD3]#?Ufσ3ZX"QD0~alެdZ~'Hn>o9K^ye0WC4;v}}Jl; QQw:IJ?ϤJQ}Or0 G:/Ɇf;m@3g=ɜq@m~fyjc N{;H6-GVk~us(HMxnb c{ ٳȡȡȡa"=r‚DbCjq15BI&18DE:J&!66]5up;>)l`PE]>0;em`1_fmG`~uxf"q%.'&D(MKHIA:FZ҉DDsEMB,( y!"\΋՚½;SP0zym{zfq4G`Øq( r$`0=Yj5T<2!&ƐD uΌnje+~ė@KeېеCh˼i7?2v.ga9uvW:0Ά/] zJ;$Iϩ}F>$t (G.=prpYInҢ9&.ɚ['3>yk,i&/7J֭sو|h[ 'D!ܚU7ق5Ʒ8RT5\]ʼnLQA:Vh׌7S3h 6sj=n-^Ew6k*2#ك4g\8w>JP-XD >eq@&YH=L F~iwl1E5[ŅF,JH%%aSh|a#DCm Q W 뾸'uD FP=5* I%,V(s+D*pFҌvzQ`_@ \q8rƁeBrO" b x60(a<(cq0=SL|q?c{q.nr61Ui2[ 8ݒzkbzhf hΩx-UU(ҳٞȈxܭ?m51})mςx tnw8'3x('&q+la|c#~w[n:?^_].+) 9uI$D&(O&2)`"rHG+q~X5f 9tY/!NX3Πu "7\'$"c/.k[0r/r-_6-R2i4+^ykKAd&HkvGnnN5U{ۍ+Z6jR6АI &Ӯ5 *sIu nH%:9r9A8sjn']t..(UD X+8흣&ØԭJ?Z2\ɓ;z%WKz]||Wrd>§ 4䴯Q(D lh#䒰‡Y&U˰յ;xaoP)a3V- ;;nS\w`E\€4++!2'ֽpɱ|H^Ʈssb_$ӑE>]Ԡr͹Px6\ &cnKM(´奄GX^t&7?(Ӷ@ fnuoJ=,1?{Hn$e?,ffd;:ag)a)Ir@Of@ %p)TQ#zkWt2`ͷ#_2鍧͕tu$ǻ/v4Lo5GeV{wo"KDfqCǰ!Q$OpU]Ht(9.Vɨ8\*UfDrdž,_09Nt#{tnxȒ$CF.bE$ ><H3 G?/Y.`i!(1"C N WLEaČ1v?b2QjReўC!w9yhp jz#"tp0%[}@,t mkӲe3$cXbF!q̣wߗ<Ilτa787M\Ga,1P6@DX%w" "zVZ0 m9BNٴW dAܽ`C ]; yJm>\A-:5ܬeѪLd"Y`kq hPƊGG`N ^z覻8x6 %r(;CHS`N!NCy dc2CS OkNzIx|xKVr7UnU] Kޢn7|%-f(y&8BRDBey!( Ir rXx80Լe勁M ;\[ތNkY7,&,䕛Mkx7k [nؘ;\[{)rHjݚWn^6un1nRNd{b`~ w0/Cw X+7QbHtʗ0*}G (I)IQg^; jZRod> Ka-1*N߷G-@>NÆ`R"w"bl~  *elDŽ2, x0ыI DWdKL-+%0 H>%;heZ:.(la$R 5p dI9bBAQOjz8^$ ۄ[~VOG~1t(n?xZ ue? 0 }~TR\Ԁ%8z347΃NC4\O!?Ɲg>{+ ~ncCF:3ݛlXb;XuWWcuԆ`b2 {[RcqB/6ipBnD/<1Z=:F4n X+7-6%0 jw|,R<>op N )}W#ӡmyYϨoI3|/Fp)nNc0٤Di*cef\iqO;9~,\ngtw`a^Ȏfm,uЀ)OyNsNo^敗Ӛ]rVh awy IЧ9yX2A@\>%IA8PEN3+i s:-=e ,ˑgw1uhIb2ްʺ>/7862|gB*l#8qv;6BU 6_J|lFdC߾ʦznc,Ye[|U=~W1a~ז7:Z-`O=/4=0ܗ40,_éC5ȫ4"%ZU7@SؒK ,LqNCNhe"9E@6xG! 
[1uOg:\<|xǪW?U 9jyt}F]<]v̌}AKy01HЊ=ZhH wIv80 +դvR۬*z%nIъ:%ȉ{0ާ+tpi:NNPk۾mR.ؤ>WvT|{ 3QhD~g+rk9jTJ\M>,=$1W85`C2<{7lݻ{&ᬟbڤh=.mu&/u<͵' HԷ46/Tdx;zZsTQyϟj E&XE\a8g#TrSG4wՀhtvI }`?4%ԏײwϜG-}D[zb'\%O-Y3=jaٲlK)5s!Q Δs6jgΔ؃أپ&]FHB[tz\U-h&~xI:GOeiqx,8 Kr 󯯋>|ӡF7FRB1HvнWaO#U`8B<(E3(sOӢEQD2=A]Kއ3=R ROtPaKQ {nS))-{{t6_hZw7_m>S](^8ό 7Ԁ4Qt0:zglhp̳_t蘿"9֑e܀t!V2P[Ͷwy $!`S©5^k< ;,uZ[N:yЦz5C$$q\lOtvո>úl6?6`ӥ7I3eK홼/VX[ۀy[ ןoA"78";ȮY~*AR8@g56Q{.A Bї5g8ɰ^&fT'> pRBfMq Ie`ƥk&(SI xڜNB,`0[+8"Ќ]}xSvVmz֊џn9@4m`tHВ E[soP;So:!;'.j_64 5>>`;.1`hZ Npl"woI;rT,: n=ܒBhC/ cennʚ'εFP(gA!9I7h I+y!'\XB-('[OYPHi5(8sBXMe}4 f.% ~\D*A:VqL+T֨_ƔΘCZSVBo&@eKNv:Sw!L4i.pŒ [Zɉڃ-.$~/)tFf<\ɘǻp'chJuHD6t7QUAhl`@(IS*D[UC bY-xѳjbJd"58jY2crZID?_j ]$d u {׿!iޛl@T~GjY[FZ:e+ޢ$Hta\tzw3d?Spg}u 7(ׁܬymyol]MTf@o̗c5+Âټ3>Tl$lymz%p99,W@BTSYԼYe~X+7Qu'x7\/kѻb`c:pnmע͟!һ5a!D϶)In1O޹= ~-Bh&/OsP"V.X;\ic0?H9޹(<ZuISXW370l6|x~G-;ԲA?IHc!|<^d>(Gܑ`Qp" bZG9ği9&/:8T^V85]Aon]Yc"-B?$R}-2I}Dh"?i2gpII'*P)> `KY ?{8l/3v~${ؙۍ|A@/)9lKeFCYsEXգ"4Jrr>ՏN`$#G树1_~$AKtgR IX ]t2lOٺ ]'ȸ<܊'"{dBd?*rKJv ӥ7uJbV'ӦGF;Bl: 4=6.Zbt81.zC+zJu!*:o7eiw\- 4w kŋm5ˮQĔ{)/\@ &svG~7C@m~$drnk%?7y!;o#!u]\5+ӻe (Vbf`.ڙj/8{j>QEJE^Br ia^ ߮La>~ _91zO !='{XT}zX }8@wޜ \۽Ӧy=RB7vwX.!kSCňս[4Vc@F6bRRwz1y9QU|?\P`\7@oR7JNV#>Zv!DHwPf P*T*Q()xx>pki[$Fċ }Iq 8Xfzs``#<"1$L3ZKV+83TySJk1B ||j^GMxwC4Jg,_~j^[>oSzZwZ?3-Lnf3!0m^HM9Rd;OT[ъu0-2(qS5IFgz51ۼlRNӤ*|F_}eg[ fJsP~x3_h^'zt2qL<}?O;A,6Yfج\{zq{s .kW}/ih>Ic۾IIЅIͫ]\siV'їewD:މ98x[o Yv/OuLqK/f]㿌͎c<<ݼLfIT?oIntJrA\3ZɕKIy<;j3,\SXq;KI)<;j@,;9 c#qKH6F$ulDxk׷2`[тxҼx"GQ ֺ\y{v9qSWZ ZB.٭g5[stPQ1OBڽ~~~4jgT~.wM% :ׂ0z7zƵ4N} H!H:'YV!\#G\p F1NA28-D84x9w k~n) @:ؤ@v7ejjX& >@KI*zO5Cu#|~ S Z?y [&/7= ) 0]idmK1i #u-2NaVc6Lj|O3)`fOL펌R+{9 rKB4 7"+-~؍d{ƛ;:ux窨$c/jQ,uj 4~;Z5ժ_IftP=׵UP|iv,#&  ͼL{DfwO#?UwW1"G22V)5@Kmj~CR&lJ]VQ3N".?"lk!Q}d\|`?BUb0jGd|*(iz\x/1>r*^ !4LHC?{fl ^ K68Tu|ZQTt!ivyE v:ՐS~rrH0XVpʨ^ KM. 
*^:ߤS]5'qr}l596jM6PdRTFմ@8eϢOK <%(Ŕw) o]C?UANOrͤEv_?/t9Eg٨!}i` ( ڒՌvoiQa ;ua9j+><܉דVu2P¥ћ I\c9~0D1"3M0@vb*uFw[ hjE+jTLdJ'תRFer0* O@3#"V )EPԓ1O$/9tDa.&+r1UFUp_ nOɮh & DB=Ef Jey "ET>V>3f6퍀Y2ƜqzvD\9!pMO_)Kޠh2Vi7H&^ۍǟkKH>`J >]O,#9iM&Wy!Lٌ$*%g$*ӽ77Gҳq0g&9I%_b=0QcfP˕AWpZoFܜ| 7inX`X{atB3Rs]Z[nU~{V}pGp.E ^[^NBNVB4['t.-pVv򢻽UG]Xbي vtF[XvqIewtZAo!޸@ ǶMC/(W-`ոzez,Q=tgQJJmZk NEQk{N;Igƣ-Qy$E+tR:9RyҰqO$hcCa 3O. ӭ'3[6IAou9ڼ1yҜ8fޕ&\r[7jܡR7^ɐN/ Y-MC1*٤Rr W_ۅ*p4lҸ p^7616@++xXp\j!JI]fR q]vo --ZR4Pnh1.U7*YZuip NAN2-: b1$ :4t s ā9Izpj/b)<&5j6%$&-$FA9RƦٻ^Nq*,f:ʪhi P4 *Ph;غ#8gry7\.|Ҹwnѕ7@wem؟`߭h "uW[hOJSTGCsS,XS&S[U lT;X6 nSYZ:4;W#":E@2qe®L.]9ۓ  ȉ{%W md)#8OF_>ފ`Hıω53Jv? y6|xn gP.P: el_Xl}DԲt ,/A?=95=Z. f(1ZMŨ#.W|rj{9 rxnGAeq撉ZNKJz7k6>zYJ=ߕ(٨ ^_zo.֠ Z[L~uZ|u,{[k>À NLo-' c%` ޝT0icTޅsT@Sk,輚 AY!ripZ (ւ~eZxJ6r dUdʇF@?@GRpq}& i@E'"=Ór :6zB(yH (T8𐌸9~K`Àa#%t|RVB0oXÀ*8.S9]u%nb,ĩ%jӪ#襙n0f,sG#m |(/P]93/CQR®Uu"XAuuŭ-v6hݜ.%-($;)AL"HE9Piȴx@h+mcIB1!bz0{dž~ Vh$Ȣ=DVq)fUI]`dyNfdDdfD]J; ):Jl1~=E+$1kfڶ?(&$tv#y|ؽ?L5Xq ⷾM_u9DQ8!:ڲe!Gy&I?3sn}ff>FK h9&h pXh~:)qqMb'L阀5J4=&Cy[( Lmn;b>Ês`zx4F'-} A,6dݬb-7V(Z~Dظw/3^LfM9_`T1Ymi1Z#wTYbz8 v7Io9.YnQ6l̄5RjA4UtH Fň#$^H^vᓷ~ ֛YhfMPsE|B˶kU#RU#U 8p>6f,Z*Rx9S@z0e|*@ t\!fubn6w; xVwf?c-:A@yNΨݑANML$D5 TnW߷ږiؖfyߪ:}vՑiimvyS$neӝZӁzU5l\#IЊRbHqʴHez[PŊOʶjƗkYCKi]lm6n;▲CR:W+t *S [1%A bSԭz}" ud^b415Ϛ4Ahe*;6MATCۘ\c:vwuD^{TjfT j}kP;48Np`!x6+Al%g L+yg{9VsD&4 D=t0Uiv*<&x٫ @գŭ[5ܳ #5p{8pbI`ATxF'հOJ k26=_pTLsWaN Q 7Bхj w0܋$FMtnp=j}rUrtvO~)MXDEv_L.l9RIe,ѥ&O@T9s=fLytE8{xț<8%جG?eh=]E}<~X)ɦOM.~_#gfy\|[*jryQK假CO5x=\n~0L8kp>_a)lA 4: )YcB6()2tde5ĈD Fk_~]`@LHS u`"spo5ȵܘ/x53 Blp͖\ݵPF: r>,z˯fvfW.7njA%MY2 4)(|72̿o2U썐AWFzxpBO1;w쎍g:1ng[6B_[օwB&zsu}]7C\me^y' bc5\]N"ǔjד5;֍4ld+@֥ ^,5O)k9>ڄjxqg{'4 `&>OYLL'z' J42',{!I6 K.`u\KHd&r)],G}FZ.G&K}|fI+2#UuG2#2c՘| r}LGxe8Qq֕+ L%\qF]cUdYw>S;jHkf7Jrf6CIŢdA@VtWoUfFS;d#&}7a 2gVV&yc4hTzhBÍD8;G´#h*!i;w IDTgY'>3./$>@J8 |h!VUHd@ ]Jg $c͕<Jz&%/aۑp:_K #i('a}dH8>.C%wt=prpPB @!.~TKI*y:HNɘHNROCHk12OSZD rdπ9UFcΙN%r2b|2,ޯh'ɽcf'Yd&C |&R(7x$tu! 
~5^!AD rY-+Ο'CxL~f欣mRg[lsho<-TӃLy/r}ȓl L(ѰHM)[)ek'!#gX%$֤ gfZE<\.dy:^txwWY$wsBQiԵR9#jn> jm7qk ~8;xΌ32P~];4+?)B,u|vr5kX*){b ?WM5W94sStT7_E^DE kkNs"4PFeN5;$0չ25l|U>֊YWU-}\a٣X:pjs]-GJ7oՎVV[Aն3]"B6Fu}-MY-ˊq* - L\ `pH'w+knBaK; ) ZE" z[кq2/QK‚wyּ z&GTQ:6Mm*w!p-E^.m=פqSmJ֍zY[tQQBƔ]P,H%iMmmXsLnWk v*}YRge ~<2`6fBrw$}4/u,38)\K5!1500ASD0>Nkw}A[ d( D)*gJc3ӥ[nU13t~G:QtmV=LNV7JòE~;{k2ͻp|\_ݗ&/ Y0}- dfSRi_kLɘ MR&qR#{x*.).㙸3rTƝm''ٞHnI/~BP.)Bmt~KH%KH9; qYi镯CCqs5s&G*}(!'ti 8|DD+e}r=6Ht631<  gMr0ዏRnp WK5ӽA-qhH:9D T AԶ+Z5dzDDšwLB23QYUB49"z枙?H"{cl[MW_b>iac 1LoHv 1z ҽ.'CSf(oCPuK7r6»xg33g"6yYlϡlDӃSqL9eXr >- 'Xp%$<#ɴ+k-^9.se* ]HsC1T*YvhD.;DGU:s(ߙG$@,0 q: tVB @wO!Y-$Y7 ; <;sⳋ+-떖/vttΛ3Mt$8o_'.Mx"3VZABm(m Rnr{tYm|m\M>V= Y SFQ`*ǭ'`Nn_Z[aBD')/nbbtJ݌+aviT2Y&OSVc4ަrרw\$K!mVvk5inqT;e]%:D0H8Ա|˭gܣR+Y x9WARC2bV|C`}!2 Nf#V!H:n'k:mw[GWnLX;[V5Mf6{BC¶UH2VY@%.RG3b-ES4Xb\ihQ8k26+"cz ;`2$/[QrKf0>\Z\ܻ*C /dunܙ]غ3rԥ󓊱BPPgZ۶)L?8fNg줹g^HRDyx_ԃÆ̇l_Z$`bwzw$I`Q?4;oXQfNi(8Qכ Lopj\DbmgOƪo760*6ɚg8KAԷ)*TgIa>'yn1x݉61ҳs>n0 u2ᕃmQf+`)',PV/ʛ!B6~\6m(cCJɭ7WRb0bd#hbafkdM\ Htcvo Y3ylG _Xyz OEF] ]yZO=JoK 'Z2俕&e ͜ )&!C8X̾O }:bF 2qti  iO&DJNJkg}1 :-qUՌZMVRq^ClRm$@ȋyxαNF:<kxLb8ө~D]ϝ's)|vo2%P} Y.=WEbhh8g cHOd$ 5щTV=0Ee(("xq c?~b%r.>)04$2@E݄tA0l&&C*1"%5ذ~O?'E:HU6 8" RR! 8$ FX>;cfR3`W|]Ĕ2pl3ߒl(Nc@>D;сx[N,8E*L zz RlvrEmie ձ w~Kšgڟ@M Wڎ AN TOZ1>b|ZI ($A2֜ e \iBΒ`Ht$C(8O$2LHv<'F?YʧlGN^cn֮7#?]GX?(Kw 鉉Jn"7#G>z'rT#L*b[xy*ـD^N5=gם0ʥ֏u΁"> Md8I2JN;/7H&׳lSb䆫h;.>*{/d k{T:6Meat}c4}8>{xqeFד?o~?}ɛ]wҶ͠)v|Ku?>I#rOtǺ8ՙ1ozWp}fQ论5ǯ{gE<,Vxc u67Gj-EY)Go_-! }٫o?<~Ť)uht?|ӓ?ߚ/yS'K.W{qgѥLJLO_p|ҘHQyG{+ tS{ITҳ"*Oo3=8zۗG;9,] -u0`YoXLGڞ̮_x+k#e*=r=>s oPsg⣋^u_pJbo|>SYW'P9ïCُutitЕses +gj&,kNLSUcwZժuOWқ^E[=:O/\O›9? /Dy30J[oo,:r]\ ~:N|T|woW"f^KL__nqGuy3eǃCc!7f~۵̺:"K.g}/rиw>i2C={;vL {*`#unnf rnFw덕l[ pefydмع@!aǹWv5x7*p>U_` »@6>3}:Jx2$IgCq>|ײ. 
* W!)F HQ,5p1@C MA疀ParurZlTbM%݈ͫjBk Q I[QqT+IRPi ͮ_ƀBb{v (~Msd4k&b=f 7m: @ -Nqj-{]H`ԩC6$ &aȞ/S IJA|_卐#0ӲOcȨ@egogi4P'd\f8H=Y\QpB8*CYU};u~ >,a1+@h„)i5}La`faէvBMJy!G ryI1/~alvUN`&څ:lOTrxڅl(@|ȵ]<>0N#d-~qxrd@ zTOdZ>}ZYWa(1,p$1 Q,0f*1L"IBBϺYײ/Hq[ܽDonbh@M,ͩ ҊjXFf#cP ґI&N2)nlbm[QXƞoJo'oQ J il7_fOdkq33ri`_A)'4uo֮|nXKX,2Ge5`P"Zņ"J qIbȓ1+O1/)s3d>QĀo6>n$#N|%T 6F]$;k?_pb{vF^L RskhHP$wr⸻Q!Ek{!jHkL*_ʕal<)܇.r!jc}/)fͭy[Q·&(I- /R}7DN4Txs_ o5K)3Q(W{Eo7j2(RT!q`[7(13$ܢ\+'LyF Co#ph `56 hA)J 1 HTo s`"eV Fke@TP@lbgq3e#Ux&ՈUBy.3kQ WD7Cʋq T_}nGwj_:I4рhJU 1 P#xt6{I1 >^T6x; SUXl"eȀ[ g0ސX&:h!DP4\~aiXհbUO] ⹦ק4fQ1iDVWm ) ɹպK`CZSGDl؞l=*d['[7>X%'hf)LPŘڲih3T]BU!5j%q K,1\.^WCp4bb/Tď2ꓗ f8F)K"ej(:@!NP ĥ% ^.';EK)(|ZD2 dh%88!C(Rn+T*"$˰ IZ1d/ g\m%WG "K%[ 9 HD8\"m@_ - ȥ%%{HcF=U%F]i<{XG} h77we%']@ j=_7'}RrIݳ ңqf UȚ^!"˾9maP^Z&a)!Ͳ:$6:*uD O&)nq? TqImbVv CpJmN/3s] ss9aLD%cm#"Ƶ HVU*`|ڽ5cam-hX/xBF 6 |̐U@6^Jp{o̍G}/ ?Z(Lۻ;_*FeM EUPr-aapJN!K]t46xJ#G٣`@:ԤU3d\6mkߣRx.p-& ?^(ؚe~3YMD JHE]=PH OIj} F[TFu6:L+#sn.&zǓ:Oȭgotӂ EsBgơvWg2[hȉu.,N*jV׋O.'Njv;.xmg=:OV Ho/L""(KGx Xv5=Lvckpu()X> Y9kA|:3 (PB^=rb 7KႡ@ %Ciw FLvZ'j{ H)j D ;BNE)4֥SkiՉ?UµTU. y*BoyKC3*/W9yGq؉Pi?sLOph˟~v J.j1H:p9\0/;wF{4ܛ\zS)E )!iXP!M@1"满!2Np#D;Bw}iljH `- JkЮ淐~S&#`:%S-DYin7dUR2Xc](` hKkQ!Bܼ= XA=Cf$ =0B_8hĐp>1B" U[Q<1w OeH !zNO~$B^:#)O01E@#tGǀTP0cJ"#BS,='D2CGP-0G !i1bgF0M"G< Pp/9 ]m`)0>|z.'&D#8j8탹XYh}y@r3㷘c9@c33UID X 2":$2Xb:czq_ vх4ӻ`9Rug'}ę ?tWrZv\f;*E~TԳ"m9gz`0҃'KKGN*8dT˸p yooS}w[ީޠ]r2?ye9(;:@eW?Bm!Qp5V";{Q 3# f6ͬ,@|jgjU^kEI%#?%%Ec6[c6o|0{p),O>jD݆"OϏ֊;)|Q#Hm١fFaPYf: ˂,{qp\P_6/%bnV«~[뎹g5H_6UKWAa@{u q.gNƙMU%Hxt M|!JgW\ 5uAEY k=EErʅn(27 NM:](#Gɠנr pq jX&bPQ/%# AŹAYpnM2{ afAS8y31&<7K˧ <[OvxvUo1ʀ8@SM>I7"O_k[6ž(#_ߴ~v))ol >.ҍn">T,񼻮0ٯ1U3uQvJ68ﯚl.ſ\^'[JQϋLn٧BbwiJocT]/Y}U]-糧XWd&heigYTq&x.oݓO\? AK)Y? [8c$W̺F+3ζdyq9dnew\? 
s>4/Ƃ^)T5H@@J{cT1quס ֧b$8#1la?Ht)¹\nϗG͵=8)dsɬgQ^8䅸 Kb/5ox\òQn&-?+1X3V ,Ahvw}-o`J5asdـO-6]h}8Ms`PJ+ò'e22?syسƟߓ}nנrgЭnvy ^nw E&e761)W"DkmͦUR!PhJ!ʏ2%.L%~7{egx`3^UtE>e@YᡚM i,jb@CMc]9q0&b;oPVIJUU3o= c\~2J vV euA"j"`xQEhHޏ/g_ Y}kqM(} gvF1BBjmqu$S:t2blwM' dȭBȲ\2C5ĒJ#3iOe~U2eV˶˾ICl!X]MB C XE_N*˾}Ve>Ś4V18-ʕJBt&RRCj#PC%-zĵ&TsGMK Wۛ>-H)aV90Ӻrn*y>g"]ka9oE6E|ise\ SVt:0g4˪`2Q,RL{T~Qx;՞wvS\[*]uit 3>/aY*E{hbu,_eoQ[P=V~?Y~ s!pZ&О1Q:WPPB _l[SU\u]ǝ8aWlm$SJnoocEnx),`cT!eO( t&wICNԨ'zݷ>=a2joP#KpVs7hXW +'%/dijpʪ'%0`9U'}X2gSw?Ux 9S=ALG}cB?tadGYk::6SGB6!5*hSi*mkak/`G~{yd6aɧ$uS)D7dѢBo8Hl!`м'm5BsKdYhj( K*Ǧ%: ; 03BlrTMuJ+WFx$e gd-Vk\s ggMVhvUnlL% GY8$6|!^ORDnT0h`PNg+Hve!;MلFNītP.O[JL49*XMf6cMU9TւHD|=zh)6fTĈ*%q X :$ R]A6F[ J޲b/1D+oeV*uDU*(G0{p`V M P$kf^,:- ֋k! [iAʿ:;@8poj4Dq!7n-oyI{yUu)qk4N\XE7n-Ae 1W[^7n=hRrec(&gեw0%fr6W@x'^SGz.C/jbxН:Y'ŝVGv:!:D^}4g)AAD7*@[('&Ϭ!6kg}!NGU^:B#ZշZ@J>h8RW6U7,݊Yl3aٸÿmaHykeXʿsrqV1@5f_D"8h[~ ٸjEQ_udKImLdl'fD@&wyu(g5;~D*UWg!N*JapBa "/l?w >Z<Eg4}ޒW,[<{;ڨGA>]4꒾L"\$t@bi$vZQvuL4R r0+J㤹"mʴ^fϟ1ja[^=e2l 6N`0S޸!ij0L2dC+"=*k#S*azΫk7Ԋt.,@3[:J<{~Vqc>jphFpDΞ&o n/L:SgPE8gYУ8V5ɑ8[̺ c =S &p Rkb>B1m-Dʎv 襙LpDE*U$WUQ01Jl"#" zNN)!+!40%s)NT.@5:{Ka S#GW7rbV}/J5tq3 jA+,,av0w'~H7Kj4 a 7a1_/0Yzf{ uExdR_eKДcӻ ÌаT9,C"zQRg$Ӓ֑fhw./\*, F\2vX*޽>, JC'ٸQn+gʵKz* K akBj5p/I0!vF9JHS!PQy(%b-A'jRro$B0 ,'Zb8 eZ@AooPY,'}`o ֠6DS?-_;Z}Q*\цr~U^"<<xfN|&,pjqKcY[5C_XNg"'T375QD.*CM#!8as8{mj *"~*ծ%`kD-8TjhRd7PӊZԌWAO+t^ "W:K*^SrDj%F@p.ˆ2n"`h)!qfEˤ2FcS^2l,S-DY#VݸG+&R2EiuXՊ[b.jx1SvV"ZԴ_+Zto0z롲O4wkMޟj͑a K;;3y?6@ wd3y7hr:cn狳\9k茳[CthU|Q!=b F8sPq}Ucd/a`bޣG#Fc]E 15w/&ȋɌ/}p *nby7}ۇ7;}`1ƏcO{O6̀d;%T*շ2O0'ҵX`> -[E>&~p7Yx^JԘډ-,S:lQ~ ӄ~HI?dz,saTeW>l,G.,kgVk ᏋvnB]yѲ$/. 
d*A#uwzCj[pUGGact8_9%ߒ {=>?\35miyӽC|Jˈbh(1ӽ#91sJ$;*/3,92}ClIb Yr]·E{;E) c.5ӜO2¨ƌV(!Ә9̴i8wnQ mounocX.P֡[ M(۽=@y^VDb+Ujhk ɺ #K”mۻsv`Pp }zIzSЖ5\PS )|X颼u4SI?]ӱc&nu?Ę,q6&10}ꫲcJOa64?rOkc),m ldgbXe1ٚTiTX٭]C@g&X\br䯆ׂG'`:,檪lh1፣Nq⌂jx)hiQSGGmAKxmzT;ɪj/SlŝP%4=5@ϝ4&{w @ͻLNr䯎;ւGw$ZS+`⪀7؊lZZUz :eҮkLO2aNe#u_<ڿ,[<Ay=Ct6EiK܇%q:.T`(&DIVSav *1g{R-єz0F$*赎F Kw (y#-@Z EŞI* # (̢ypј@Vbf4@ATDFү2V# ! bT),aV h`r5FBsE4gAGFQ`{نNEu'[ʱyKO16i3fsttPɢUC ]DǐrN&01Pɨ5cΥ>bty%wnleɒy??m%dӷ_.F$#4Z :;w7}ARIq7Z)M:];=WnnFZ,`RtU-ʩVnA5a 3;TJИ-)j5KmW晹 #i0zGqg*H5'rKF&j3("Ny&:d0B1AQYP˼@c*[hGI(A8e :'9X|Ю{: VTjXyP!P(:#>(`J*@-H݁'8&0=ޖs JLKTH4MQj12Ѳ5`BWrSgݖE(J$R="CQ4j;rxD^f"0Rh/ gG1O ńHŨ1YnP#11 `&;S+ZBmsA QlԠZIFvuiFlL>JG72,Zr/̠ի{;Q1\7]s+e=7JJ~ Q:  LO)Yr4=0,CqamvbcSr3:+h$Dbn 1ړ"n;xk7+r^m.<\˘ڦS #Zum>idmʮ򢓀>{YmD-õ'`?1$@Jtn_|շBtUZS8 z*tΣj"itNj"IWsΐ@CЮ8CM12u+<>{bhWVFv3 QrvιJH.;|Αp]<6& ԨҖ:HXWqBa)S*.tYbb9vGMj% uR5OS?{Wȑ Sdއ~؝ ;6̋D^%-4Imod"XU< 2d~q -h QA#6S׋!;}9,1f%dcFЏA,)(HI-8b"7[lA3s:5#m,6G}~(f[mg~n5 }JRrQ$chƳ~08F1kL{x( B#B5J%_364q9ɝgjf#&sH e.0G\I Di$pӥ;1 Cg_(G JFPc.y`p5w?Br8ڊ+,xr&k`yӻ< l<{w0Dv[%wyܸD0"zF"f<#] ]ޜfeݼpW!;d/ 3Mx- "ҋ#\l/o^}XenNa:{rZD)Bw7Kk, ]gl&/!xûޯf҃2voK~:Pw&+0;;&~>]CD}!BW Ȳp}nhD@Y̿8vuxb5;?w;7'F`zf*7y|'(TJvPǙb1V|7w]`;P0q!ƥ d~\vb :>>F2],x{1ccҝg9Aj2\Ў:|[ r=|xX&`i%2leh6O:"f u\W,)EIIo|+B$ 分!-)l)]f MEX eȽeK OĐ 凥۔3V_sQN gOR̻|T/aLg&'R-`d=RH7L. 4`7Gz R'%3%R(ItuYffg]&9Us<κa|lO.PC*9~aGPz}9ט*K(RP:"sʌէd,5~&ct'd'n1!>dQFkNnMN?"'5'T[^sroj)wEp"D}! Hnzt4G6@-|* >Bzy%v='5V\٪k:?Z~h>ҜvQnSjK R Jy;-詳X,6[-_vHug4[ Eۆ66ܙ)~P+r? 
GMg33E : b1K9,Vd뷖"9F}L9mCrS*gnu1HqߨῊM5'n[[h-3έF<1 u~Jt;gAy[cIѭ ]tçDOo5\f:鯏DywFChx*F1 ŷ7.BNU.*H$շ7[Rq=Z +J^KQ<'ƣA {'3JvS/F)*5=lTɽ)p]o\2|>_M7od&-gI{!K\a־_|5'Cp2(F| n/vOAg>w}r#}㳍Ӌ̴mєpjk\Zk+X1T:?גEkmCC\Ѹ!50Cʗ7Jz)ɑ*5]97|FcK'݆$|5/fq~ibYo_~T 8[[ÊF)MxŞ^S,K&0G_`%eBJ6r"(G?0g,qz;3Mŭ.g[`H[-lqt^("AR޶θ2]\}p#CqqV'= ٧C K yJy-nr :G=Q/r?Dhp̡SE -g5hBW)F[#mJy֬*~^_j/bfV&Bm R{<R]{GJ2my![1b4:^/LHxT0Ԇ| [c*7@z*;xEoѮW[I* [3x[)G])*'w~%)y'wm9vVn`h&0(`L(%gL .LDN2CȌ+(UUO帋$m9qq|>P7|ʤJZCPy:3(d yQH=)Br?7P0QZj]bߺ1l//Z;1y p _*F8B?@.HЪw?)98c_kK3<ь134#Nbէ7ƯP9EN/bSN<kv{2N`:4.+Oqlfsd N&9J̵ BLjxYga1%{M})dܜx됇<^,{!ZDe8( !7'(fZ_( 4{BYK-rY.-%˴qZVfB|>nKjvõ]PKLm/)LOcp/~^ z=$91LfѐhbVRY(uM㪋%-eV1LfOfha:]"4n29 ClL&WD+cqDt*?4VRZhʔ{X-uix |t߿J<-ҳ+d??SxnST$V:n_EW Fa3 3-޵bmٖ*V< b`3H0uM4"dߗ-[%[~UM~d,Vy Kߚb{ҊKe4PgE$mW>ZO.ٯpGS>T|nSsq,5}f]3l.Dd sfZٙ'?͙lV {W.KMcwu c u]3_7V-C\ `/)z 5/n 7^%_Id K'ggjAC'ͅ?uhڵ 懺c*h{򏒐s1i:>Lژ%. 3B9L+vNUe8f!evP-xu;5u{iFkRbv${v[˲uwDCŴ-XRtQ=QO;,+eUc,#$a,)-3,  Vs7;a3ss[{Ry1*(9ŸB F85˙&+9^kseǓ干6`dMVb;Ѵ99#0l4rbl b\L`XN(Hٳv T_D% dmtdt|f(BRU6l JphY @6bZR< K4M?g]JfSȳ^O]|?Lןë97P:x7IgN:ݟRZG@׵I*&K|8>&K6Q6 -7WG@E.̍gWF3k%Կ^2%.䀦 k!iJ [FPu<ۮζKZu<~I56b窷U7tRQ_$b3TnZZS vO[KjZYFhݴZ3'ZT7-Uj`뫹]7e&g_>9Oƽ! kmo߇LUrJQ]9PRR^郇n:kVufi~5/!HAU6)Q,2s_g xr׏ޜ# >tP_RmrTTv[ni@f%BNb*PKRdxȳ*Xx5Gʣ{k*\hRZp 5>)TQ>/Pp]fʊr/T( B&΂0N,7`e^U'8. +gKJU>JQ6@6Y8hkx:\WAO<ՠ;;;}vpMKV,]w>4AP@L H]g[ @_5NcȐd+s,ޜH$Yrd@s(j^w=qnVǀު˗lQHfm f4*Iw*7KplQ <597<!xeN8Frk-(>0̍βr!Lo?綛?VJEG-v2N~L.!c k64kdQTޒɻdɱs2$MV%ŧW6s*LM/Gsɰ6[7v@ T>a(@h%K̽Rk/%dZ8HF{LyiBGKpQ16i'}(\[ Iq4a 8-vrB9ώ%h5[*4B"^n nj N5S F }Xmaa3q%ۂAϑ)) ZDpruHd@KQB\Y2YH ?fW2;,I@J@,6#|@kur ʭ~i_d.N#B5ݱڢ]'["ܡ9/JZl. r6p*Y1FrœFiYe!zW3;b@3׹} Va&4C1qA@XV-6l\hX#(R9P_A )hxR6笫|˛rx~/+V 4IWZ+4o0xJl!fI ɖ*gͣrN(U僒 6VU9.Vrș\O4Wŷ/FfMt CH1HעI [>weGn@Hn[kBx2IWITj`A\ mm#U|ce+uVj4rI_\{%>aJ╆^TΡ0QcȨN*F+aS [|jP\FKc^\q>Prw'v,w9!FJ)i^e h]uCz@.断3\]c)#˥.(̨Vhv. 
+y>Z]W-Rni)KrZM7-.݃wZZ*K߉iiM5cWHqTnZx-y]-RQK3MJn6 u/Y:k* q);wr[{CKHbz@ЬĴ=eGk-$FGVMFaY9y~EoFXYF AKP(y=!~>ZC3eEv|L.N4ƳЎ!Q*aGk>O\H{cMʕXoI3~)@7xPV<h˔Q Ԃ7ף_PUU6~{L qt $:0Qg ʭ!P0ECxhlS 7!bt(GFߒ}?mR%T_LdfjCwt3=_9=cG7ϴ{5 zD#h9P->,[hO`V#AaK}ibRQ.tb%o#Z'έ{癜`1L KY~[P'wJ`\&/e]>Wtyv38WςNa癇̗W^kA2 Rq >TC`-!u)Tpc77n%iĭOnN{og'hXFXpv ǩ;.LDO_neW)8m4Y.DOɬ|ѕ"w>}Lϧ'tt2&k/oC{:SnYgMHP-2A3eȡyaI6sX+rULy;fӇß:l0+x &&}~OQ]OY_]x?0 Q IEt'[7b`j{YW#e:jDbN,M8|uo¹3KO'~[I|B'A-xK-*_mZG((iZLc^lkYCK Y heiܾSݱ7o,>-/͆. pfFrn sOHWU% [ѭ.j5CBbR'yr҃f.B9ڨ}rZDJC-=`S _t~Yч%U&9$ Fo Ѳ!N(^7~ykx1  ,}﯆z[+Ê;GW헋BMRoRdv_g%goMΧpQjN),鷺r@X!{|zuA1Jύg5' H_/NOiXxE\cx|@^:|,6b㒥 =.Y?E8q|(ŠV$-h !GkeaAEGY=B ix*g8..ۻmWzGOK [EQ[53lvE|)ۙ8N,YvvsP$ERB`$D& %_H f^BH,D()] O.5Mf:ٹubﵗ1C`drDK*l{maX2ᇍug'^j)Lu"0'5{| yYCug5ii)(=P |I{#2k\bC%<` iv׫߱ -}lW3 H$LQ}_:$vX_};į9'MYHJt+ <] ⼎+Pdx4%2ýx# HF}FmI,'V94䃫NQ4ºIN+LT3lې fA%  *Y\J3Ͳf A1[mxbNj6)D[ &ŇbCȨfJ6 Qcb >ǡ}g,=< Y BW1μyWU.F+Ct$v&5x O5kz %K:DM} XbGfkjmhؽ|#CY2Vp3^tKB;S-L/!V26dٵb%MݱgLT'(L=''ݒ3dajbH5y[9c7\7DpOP{h.B)tsXE({: RyyA$OE[PtmHݲ=y*p]Mr}.y6|^gY=<|J> >ce}tDBD7yᅏ6n}dP^Z8B$We_'G\WYvK45դj$**锑u2]U2b5p Ql{5+o@2B Y oL,?ާ<" {var/home/core/zuul-output/logs/kubelet.log0000644000000000000000003676455615147171560017732 0ustar rootrootFeb 24 00:05:34 crc systemd[1]: Starting Kubernetes Kubelet... 
Feb 24 00:05:34 crc restorecon[4687]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to
system_u:object_r:container_file_t:s0:c138,c778 Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c138,c778 Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Feb 24 00:05:34 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c476,c820 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c440,c975 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c263,c871 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc 
restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 24 00:05:35 crc 
restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 24 
00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 24 00:05:35 crc 
restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to
system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 
crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to
system_u:object_r:container_file_t:s0:c336,c787 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 
00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 24 00:05:35 crc 
restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc 
restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc 
restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 
24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to
system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc 
restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc 
restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 00:05:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 24 00:05:35 crc restorecon[4687]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 24 00:05:35 crc restorecon[4687]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0
Feb 24 00:05:36 crc kubenswrapper[4824]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Feb 24 00:05:36 crc kubenswrapper[4824]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Feb 24 00:05:36 crc kubenswrapper[4824]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Feb 24 00:05:36 crc kubenswrapper[4824]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Feb 24 00:05:36 crc kubenswrapper[4824]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Feb 24 00:05:36 crc kubenswrapper[4824]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.433853    4824 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.440920    4824 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.440961    4824 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.440973    4824 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.440984    4824 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.440995    4824 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441004    4824 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441018    4824 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441030    4824 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441040    4824 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441051    4824 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441060    4824 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441069    4824 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441077    4824 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441086    4824 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441095    4824 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441106    4824 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441114    4824 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441123    4824 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441132    4824 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441140    4824 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441149    4824 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441158    4824 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441166    4824 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441175    4824 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441183    4824 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441191    4824 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441200    4824 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441209    4824 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441218    4824 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441228    4824 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441236    4824 feature_gate.go:330] unrecognized feature gate: Example
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441245    4824 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441254    4824 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441262    4824 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441270    4824 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441278    4824 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441286    4824 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441295    4824 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441303    4824 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441315    4824 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441326    4824 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441338    4824 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441349    4824 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441359    4824 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441369    4824 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441378    4824 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441389    4824 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441399    4824 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441409    4824 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441420    4824 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441431    4824 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441443    4824 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441454    4824 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441464    4824 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441474    4824 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441485    4824 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441495    4824 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441506    4824 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441544    4824 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441555    4824 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441566    4824 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441583    4824 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441596    4824 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441608    4824 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441618    4824 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441630    4824 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441640    4824 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441657    4824 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441669    4824 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441679    4824 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.441689    4824 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.442967    4824 flags.go:64] FLAG: --address="0.0.0.0"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443008    4824 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443025    4824 flags.go:64] FLAG: --anonymous-auth="true"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443068    4824 flags.go:64] FLAG: --application-metrics-count-limit="100"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443083    4824 flags.go:64] FLAG: --authentication-token-webhook="false"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443095    4824 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443108    4824 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443121    4824 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443132    4824 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443141    4824 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443152    4824 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443162    4824 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443172    4824 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443182    4824 flags.go:64] FLAG: --cgroup-root=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443191    4824 flags.go:64] FLAG: --cgroups-per-qos="true"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443201    4824 flags.go:64] FLAG: --client-ca-file=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443211    4824 flags.go:64] FLAG: --cloud-config=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443220    4824 flags.go:64] FLAG: --cloud-provider=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443229    4824 flags.go:64] FLAG: --cluster-dns="[]"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443244    4824 flags.go:64] FLAG: --cluster-domain=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443254    4824 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443265    4824 flags.go:64] FLAG: --config-dir=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443274    4824 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443284    4824 flags.go:64] FLAG: --container-log-max-files="5"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443296    4824 flags.go:64] FLAG: --container-log-max-size="10Mi"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443306    4824 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443316    4824 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443327    4824 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443336    4824 flags.go:64] FLAG: --contention-profiling="false"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443345    4824 flags.go:64] FLAG: --cpu-cfs-quota="true"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443356    4824 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443367    4824 flags.go:64] FLAG: --cpu-manager-policy="none"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443378    4824 flags.go:64] FLAG: --cpu-manager-policy-options=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443390    4824 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443400    4824 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443410    4824 flags.go:64] FLAG: --enable-debugging-handlers="true"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443420    4824 flags.go:64] FLAG: --enable-load-reader="false"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443429    4824 flags.go:64] FLAG: --enable-server="true"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443439    4824 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443451    4824 flags.go:64]
FLAG: --event-burst="100" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443461 4824 flags.go:64] FLAG: --event-qps="50" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443472 4824 flags.go:64] FLAG: --event-storage-age-limit="default=0" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443482 4824 flags.go:64] FLAG: --event-storage-event-limit="default=0" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443491 4824 flags.go:64] FLAG: --eviction-hard="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443511 4824 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443546 4824 flags.go:64] FLAG: --eviction-minimum-reclaim="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443556 4824 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443567 4824 flags.go:64] FLAG: --eviction-soft="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443577 4824 flags.go:64] FLAG: --eviction-soft-grace-period="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443586 4824 flags.go:64] FLAG: --exit-on-lock-contention="false" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443596 4824 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443605 4824 flags.go:64] FLAG: --experimental-mounter-path="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443615 4824 flags.go:64] FLAG: --fail-cgroupv1="false" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443625 4824 flags.go:64] FLAG: --fail-swap-on="true" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443635 4824 flags.go:64] FLAG: --feature-gates="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443646 4824 flags.go:64] FLAG: --file-check-frequency="20s" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443656 4824 flags.go:64] FLAG: 
--global-housekeeping-interval="1m0s" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443666 4824 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443676 4824 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443686 4824 flags.go:64] FLAG: --healthz-port="10248" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443696 4824 flags.go:64] FLAG: --help="false" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443705 4824 flags.go:64] FLAG: --hostname-override="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443715 4824 flags.go:64] FLAG: --housekeeping-interval="10s" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443725 4824 flags.go:64] FLAG: --http-check-frequency="20s" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443734 4824 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443744 4824 flags.go:64] FLAG: --image-credential-provider-config="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443753 4824 flags.go:64] FLAG: --image-gc-high-threshold="85" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443763 4824 flags.go:64] FLAG: --image-gc-low-threshold="80" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443774 4824 flags.go:64] FLAG: --image-service-endpoint="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443784 4824 flags.go:64] FLAG: --kernel-memcg-notification="false" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443793 4824 flags.go:64] FLAG: --kube-api-burst="100" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443803 4824 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443813 4824 flags.go:64] FLAG: --kube-api-qps="50" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443822 4824 
flags.go:64] FLAG: --kube-reserved="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443832 4824 flags.go:64] FLAG: --kube-reserved-cgroup="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443842 4824 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443852 4824 flags.go:64] FLAG: --kubelet-cgroups="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443861 4824 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443871 4824 flags.go:64] FLAG: --lock-file="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443881 4824 flags.go:64] FLAG: --log-cadvisor-usage="false" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443890 4824 flags.go:64] FLAG: --log-flush-frequency="5s" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443900 4824 flags.go:64] FLAG: --log-json-info-buffer-size="0" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443914 4824 flags.go:64] FLAG: --log-json-split-stream="false" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443925 4824 flags.go:64] FLAG: --log-text-info-buffer-size="0" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443934 4824 flags.go:64] FLAG: --log-text-split-stream="false" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443944 4824 flags.go:64] FLAG: --logging-format="text" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443953 4824 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443964 4824 flags.go:64] FLAG: --make-iptables-util-chains="true" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443974 4824 flags.go:64] FLAG: --manifest-url="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443983 4824 flags.go:64] FLAG: --manifest-url-header="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.443996 4824 
flags.go:64] FLAG: --max-housekeeping-interval="15s" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444006 4824 flags.go:64] FLAG: --max-open-files="1000000" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444017 4824 flags.go:64] FLAG: --max-pods="110" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444026 4824 flags.go:64] FLAG: --maximum-dead-containers="-1" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444036 4824 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444046 4824 flags.go:64] FLAG: --memory-manager-policy="None" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444055 4824 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444065 4824 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444075 4824 flags.go:64] FLAG: --node-ip="192.168.126.11" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444085 4824 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444107 4824 flags.go:64] FLAG: --node-status-max-images="50" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444117 4824 flags.go:64] FLAG: --node-status-update-frequency="10s" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444127 4824 flags.go:64] FLAG: --oom-score-adj="-999" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444137 4824 flags.go:64] FLAG: --pod-cidr="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444157 4824 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444171 4824 flags.go:64] FLAG: 
--pod-manifest-path="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444181 4824 flags.go:64] FLAG: --pod-max-pids="-1" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444191 4824 flags.go:64] FLAG: --pods-per-core="0" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444201 4824 flags.go:64] FLAG: --port="10250" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444212 4824 flags.go:64] FLAG: --protect-kernel-defaults="false" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444222 4824 flags.go:64] FLAG: --provider-id="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444232 4824 flags.go:64] FLAG: --qos-reserved="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444242 4824 flags.go:64] FLAG: --read-only-port="10255" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444252 4824 flags.go:64] FLAG: --register-node="true" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444261 4824 flags.go:64] FLAG: --register-schedulable="true" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444271 4824 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444286 4824 flags.go:64] FLAG: --registry-burst="10" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444296 4824 flags.go:64] FLAG: --registry-qps="5" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444306 4824 flags.go:64] FLAG: --reserved-cpus="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444315 4824 flags.go:64] FLAG: --reserved-memory="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444329 4824 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444339 4824 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444348 4824 flags.go:64] FLAG: --rotate-certificates="false" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 
00:05:36.444358 4824 flags.go:64] FLAG: --rotate-server-certificates="false" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444368 4824 flags.go:64] FLAG: --runonce="false" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444377 4824 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444400 4824 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444411 4824 flags.go:64] FLAG: --seccomp-default="false" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444587 4824 flags.go:64] FLAG: --serialize-image-pulls="true" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444598 4824 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444609 4824 flags.go:64] FLAG: --storage-driver-db="cadvisor" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444619 4824 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444629 4824 flags.go:64] FLAG: --storage-driver-password="root" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444638 4824 flags.go:64] FLAG: --storage-driver-secure="false" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444648 4824 flags.go:64] FLAG: --storage-driver-table="stats" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444658 4824 flags.go:64] FLAG: --storage-driver-user="root" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444668 4824 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444678 4824 flags.go:64] FLAG: --sync-frequency="1m0s" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444688 4824 flags.go:64] FLAG: --system-cgroups="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444697 4824 flags.go:64] FLAG: 
--system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444717 4824 flags.go:64] FLAG: --system-reserved-cgroup="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444726 4824 flags.go:64] FLAG: --tls-cert-file="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444738 4824 flags.go:64] FLAG: --tls-cipher-suites="[]" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444751 4824 flags.go:64] FLAG: --tls-min-version="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444761 4824 flags.go:64] FLAG: --tls-private-key-file="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444771 4824 flags.go:64] FLAG: --topology-manager-policy="none" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444781 4824 flags.go:64] FLAG: --topology-manager-policy-options="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444791 4824 flags.go:64] FLAG: --topology-manager-scope="container" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444801 4824 flags.go:64] FLAG: --v="2" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444814 4824 flags.go:64] FLAG: --version="false" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444827 4824 flags.go:64] FLAG: --vmodule="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444838 4824 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.444849 4824 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445134 4824 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445148 4824 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445159 4824 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 24 00:05:36 crc 
kubenswrapper[4824]: W0224 00:05:36.445172 4824 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445186 4824 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445199 4824 feature_gate.go:330] unrecognized feature gate: Example Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445211 4824 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445222 4824 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445234 4824 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445245 4824 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445255 4824 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445266 4824 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445280 4824 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445293 4824 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445305 4824 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445318 4824 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445329 4824 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445341 4824 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445353 4824 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445365 4824 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445373 4824 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445382 4824 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445390 4824 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445402 4824 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445410 4824 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445419 4824 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445427 4824 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 
00:05:36.445438 4824 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445449 4824 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445459 4824 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445470 4824 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445478 4824 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445487 4824 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445496 4824 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445505 4824 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445514 4824 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445577 4824 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445596 4824 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445608 4824 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445618 4824 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445627 4824 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445636 4824 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445645 4824 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445654 4824 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445663 4824 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445672 4824 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445680 4824 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445688 4824 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445697 4824 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445705 4824 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445714 4824 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445722 4824 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 24 00:05:36 crc 
kubenswrapper[4824]: W0224 00:05:36.445731 4824 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445739 4824 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445748 4824 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445756 4824 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445765 4824 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445774 4824 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445782 4824 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445794 4824 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445803 4824 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445812 4824 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445820 4824 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445829 4824 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445837 4824 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445846 4824 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445855 4824 feature_gate.go:330] unrecognized 
feature gate: MultiArchInstallAWS Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445864 4824 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445873 4824 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445882 4824 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.445890 4824 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.445921 4824 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.457695 4824 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.457766 4824 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.457891 4824 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.457914 4824 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.457924 4824 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.457937 4824 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles 
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.457946 4824 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.457954 4824 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.457962 4824 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.457970 4824 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.457978 4824 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.457987 4824 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.457995 4824 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458003 4824 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458011 4824 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458020 4824 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458028 4824 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458035 4824 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458043 4824 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458054 4824 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458065 4824 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458074 4824 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458083 4824 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458092 4824 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458103 4824 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458114 4824 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458125 4824 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458136 4824 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458150 4824 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458166 4824 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458176 4824 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458188 4824 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458198 4824 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458208 4824 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458217 4824 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458224 4824 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458235 4824 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458243 4824 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458252 4824 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458259 4824 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458267 4824 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458274 4824 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458282 4824 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458289 4824 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458297 4824 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458336 4824 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458344 4824 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458353 4824 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458363 4824 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458374 4824 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458383 4824 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458391 4824 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458400 4824 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458407 4824 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458415 4824 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458423 4824 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458430 4824 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458438 4824 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458446 4824 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458453 4824 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458461 4824 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458469 4824 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458477 4824 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458488 4824 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458498 4824 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458506 4824 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458540 4824 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458549 4824 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458556 4824 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458564 4824 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458571 4824 feature_gate.go:330] unrecognized feature gate: Example
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458579 4824 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458588 4824 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.458602 4824 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458824 4824 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458836 4824 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458845 4824 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458853 4824 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458861 4824 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458869 4824 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458878 4824 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458887 4824 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458898 4824 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458908 4824 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458919 4824 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458928 4824 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458937 4824 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458947 4824 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458957 4824 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458967 4824 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458976 4824 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458986 4824 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.458994 4824 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.459003 4824 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.459013 4824 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.459022 4824 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.459032 4824 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.459041 4824 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.459073 4824 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.459083 4824 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.459092 4824 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.459102 4824 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.459111 4824 feature_gate.go:330] unrecognized feature gate: Example
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.459122 4824 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.459132 4824 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.459141 4824 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.459151 4824 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.459162 4824 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.459189 4824 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.459204 4824 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.459215 4824 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.459224 4824 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.459234 4824 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.459244 4824 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.459254 4824 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.459265 4824 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.459275 4824 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.459287 4824 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.459300 4824 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.459311 4824 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.459321 4824 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.459334 4824 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.459345 4824 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.459355 4824 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.459368 4824 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.459378 4824 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.459387 4824 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.459397 4824 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.459407 4824 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.459417 4824 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.459426 4824 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.459436 4824 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.459446 4824 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.459456 4824 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.459465 4824 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.459476 4824 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.459486 4824 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.459495 4824 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.459505 4824 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.459548 4824 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.459559 4824 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.459570 4824 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.459580 4824 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.459588 4824 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.459599 4824 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.459612 4824 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.460890 4824 server.go:940] "Client rotation is on, will bootstrap in background"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.466704 4824 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.466902 4824 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.469092 4824 server.go:997] "Starting client certificate rotation"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.469153 4824 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.469424 4824 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-04 10:35:59.668298275 +0000 UTC
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.469545 4824 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.498486 4824 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 24 00:05:36 crc kubenswrapper[4824]: E0224 00:05:36.503248 4824 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.505296 4824 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.520607 4824 log.go:25] "Validated CRI v1 runtime API"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.553907 4824 log.go:25] "Validated CRI v1 image API"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.556358 4824 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.561735 4824 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-02-24-00-01-03-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.561784 4824 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.588839 4824 manager.go:217] Machine: {Timestamp:2026-02-24 00:05:36.585554392 +0000 UTC m=+0.575178901 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:d5e3d68d-d538-4dbe-b3fe-7347ab36b29a BootID:7ea41d01-04ab-44da-af10-993e94777268 Filesystems:[{Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:5d:37:c8 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:5d:37:c8 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:b4:f4:28 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:fe:4d:17 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:d8:a0:3d Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:1c:64:f9 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:7a:6d:28:fa:f8:c1 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:0a:cf:a3:57:88:fb Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.589227 4824 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.589468 4824 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.591070 4824 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.591448 4824 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.591673 4824 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.592085 4824 topology_manager.go:138] "Creating topology manager with none policy"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.592112 4824 container_manager_linux.go:303] "Creating device plugin manager"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.592927 4824 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.592990 4824 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.593338 4824 state_mem.go:36] "Initialized new in-memory state store"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.593553 4824 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.601814 4824 kubelet.go:418] "Attempting to sync node with API server"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.601935 4824 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.602011 4824 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.602050 4824 kubelet.go:324] "Adding apiserver pod source"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.602082 4824 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.610668 4824 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.611158 4824 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused
Feb 24 00:05:36 crc kubenswrapper[4824]: E0224 00:05:36.611270 4824 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError"
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.611245 4824 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused
Feb 24 00:05:36 crc kubenswrapper[4824]: E0224 00:05:36.611349 4824 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.611860 4824 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.613249 4824 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.615207 4824 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.615241 4824 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.615252 4824 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.615262 4824 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.615279 4824 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.615290 4824 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.615299 4824 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.615316 4824 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.615329 4824 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.615340 4824 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.615356 4824 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.615367 4824 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.616635 4824 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.617288 4824 server.go:1280] "Started kubelet"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.617782 4824 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.617934 4824 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.618588 4824 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.619266 4824 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused
Feb 24 00:05:36 crc systemd[1]: Started Kubernetes Kubelet.
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.619686 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.620031 4824 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.620655 4824 volume_manager.go:287] "The desired_state_of_world populator starts"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.620841 4824 volume_manager.go:289] "Starting Kubelet Volume Manager"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.620649 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 19:26:58.16827453 +0000 UTC
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.621072 4824 server.go:460] "Adding debug handlers to kubelet server"
Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.621372 4824 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused
Feb 24 00:05:36 crc kubenswrapper[4824]: E0224 00:05:36.621421 4824 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError"
Feb 24 00:05:36 crc kubenswrapper[4824]: E0224 00:05:36.621384 4824 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="200ms"
Feb 24 00:05:36 crc kubenswrapper[4824]: E0224 00:05:36.621564 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.620908 4824 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.622191 4824 factory.go:55] Registering systemd factory
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.622218 4824 factory.go:221] Registration of the systemd container factory successfully
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.623428 4824 factory.go:153] Registering CRI-O factory
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.623472 4824 factory.go:221] Registration of the crio container factory successfully
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.623730 4824 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.623819 4824 factory.go:103] Registering Raw factory
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.623841 4824 manager.go:1196] Started watching for new ooms in manager
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.626768 4824 manager.go:319] Starting recovery of all containers
Feb 24 00:05:36 crc kubenswrapper[4824]: E0224 00:05:36.636297 4824 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.151:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189705f6f4979d2a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 00:05:36.61725009 +0000 UTC m=+0.606874579,LastTimestamp:2026-02-24 00:05:36.61725009 +0000 UTC m=+0.606874579,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.647587 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.647700 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]:
I0224 00:05:36.647719 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.647732 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.647751 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.647765 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.647780 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.647793 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.647811 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.647827 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.647842 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.647856 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.647874 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.647892 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.647905 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" 
volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.647924 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.647936 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.647950 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.647965 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.647978 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.647991 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" 
seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648011 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648024 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648038 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648055 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648070 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648084 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648134 
4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648153 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648167 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648181 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648199 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648212 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648226 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648239 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648253 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648271 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648286 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648298 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648313 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648327 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648339 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648353 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648392 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648408 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648424 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648444 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648460 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648476 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648490 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648503 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648540 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648561 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648575 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648590 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648606 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648618 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648630 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" 
volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648643 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648659 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648673 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648686 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648713 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648725 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" 
volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648738 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648750 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648764 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648778 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648792 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648809 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648825 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648837 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648849 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648861 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648873 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648890 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648904 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648931 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648947 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648960 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648977 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.648988 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649001 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649013 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649025 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649040 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649054 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649069 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649081 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649095 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649108 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649120 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649134 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649151 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649168 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649181 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649193 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649206 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649218 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649230 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649241 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649254 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649267 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649343 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649364 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649379 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649392 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649403 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649423 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649436 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649451 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649465 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649481 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649495 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649509 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649542 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649557 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649568 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649581 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649594 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649609 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649619 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649630 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649638 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649648 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649657 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649668 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649680 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649693 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649710 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649730 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649743 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649755 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649767 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649781 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649793 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649809 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649819 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649833 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649845 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649857 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649868 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649889 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649902 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649924 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649936 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649948 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649961 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649975 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.649987 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.650002 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.650015 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.650025 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.650038 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.650049 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.650063 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.650074 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.650087 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.650099 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.650112 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.650126 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.650138 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.650150 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.650164 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.650176 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.650189 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.650201 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.650213 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.650230 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.650243 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.650255 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.650266 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.650277 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.650288 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.650307 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.650320 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.650334 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.650348 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.650361 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.650373 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.650384 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.650400 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.650411 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.650497 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.650557 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.650571 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.650583 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.650593 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.650605 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.650616 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.650631 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.650643 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.650655 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.650673 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.650684 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.650699 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.650713 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.650728 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.650749 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.650762 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.650773 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.650787 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.650798 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.650812 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.650825 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.650842 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.650861 4824 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.652956 4824 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.652982 4824 reconstruct.go:97] "Volume reconstruction finished"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.652990 4824 reconciler.go:26] "Reconciler: start to sync state"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.655927 4824 manager.go:324] Recovery completed
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.665502 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.666892 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.666935 4824 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.666948 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.667965 4824 cpu_manager.go:225] "Starting CPU manager" policy="none" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.667983 4824 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.668003 4824 state_mem.go:36] "Initialized new in-memory state store" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.689598 4824 policy_none.go:49] "None policy: Start" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.690498 4824 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.691234 4824 memory_manager.go:170] "Starting memorymanager" policy="None" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.691272 4824 state_mem.go:35] "Initializing new in-memory state store" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.692412 4824 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.692480 4824 status_manager.go:217] "Starting to sync pod status with apiserver" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.692527 4824 kubelet.go:2335] "Starting kubelet main sync loop" Feb 24 00:05:36 crc kubenswrapper[4824]: E0224 00:05:36.692706 4824 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Feb 24 00:05:36 crc kubenswrapper[4824]: W0224 00:05:36.693346 4824 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Feb 24 00:05:36 crc kubenswrapper[4824]: E0224 00:05:36.693391 4824 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Feb 24 00:05:36 crc kubenswrapper[4824]: E0224 00:05:36.722039 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.741558 4824 manager.go:334] "Starting Device Plugin manager" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.741617 4824 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.741633 4824 server.go:79] "Starting device plugin registration server" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.742193 4824 eviction_manager.go:189] "Eviction manager: starting control loop" Feb 24 00:05:36 crc 
kubenswrapper[4824]: I0224 00:05:36.742210 4824 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.742723 4824 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.742840 4824 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.742849 4824 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Feb 24 00:05:36 crc kubenswrapper[4824]: E0224 00:05:36.751655 4824 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.793488 4824 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"] Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.793917 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.795429 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.795486 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.795501 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.795715 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" 
Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.796619 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.796696 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.796713 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.796702 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.796721 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.797093 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.797234 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.797269 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.797968 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.797994 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.798001 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.798021 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.798011 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.798148 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.798398 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.798455 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.798467 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.798478 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.798486 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.798494 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.800270 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.800309 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.800324 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.800373 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.800388 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.800401 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.800902 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.801532 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.801578 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.802186 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.802219 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.802475 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.802548 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.802560 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.802852 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.802871 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.803096 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.805206 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.805282 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.805335 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:05:36 crc kubenswrapper[4824]: E0224 00:05:36.823052 4824 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="400ms" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.842990 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.844225 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.844263 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.844273 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.844300 4824 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 24 00:05:36 crc kubenswrapper[4824]: E0224 00:05:36.844913 4824 kubelet_node_status.go:99] 
"Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.151:6443: connect: connection refused" node="crc" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.856014 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.856048 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.856067 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.856084 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.856101 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: 
\"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.856117 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.856132 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.856147 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.856162 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.856178 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 24 
00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.856222 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.856260 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.856286 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.856329 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.856360 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 00:05:36 crc kubenswrapper[4824]: 
I0224 00:05:36.957065 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.957101 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.957127 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.957143 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.957157 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.957170 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.957185 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.957206 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.957220 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.957235 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.957250 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.957267 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.957280 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.957293 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.957306 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.957320 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.957740 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.957800 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.957828 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.957859 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.957867 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.957881 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: 
\"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.957893 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.957903 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.957909 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.957924 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.957931 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.957948 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.957953 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 00:05:36 crc kubenswrapper[4824]: I0224 00:05:36.957970 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 24 00:05:37 crc kubenswrapper[4824]: I0224 00:05:37.045319 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 00:05:37 crc kubenswrapper[4824]: I0224 00:05:37.047271 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:05:37 crc kubenswrapper[4824]: I0224 00:05:37.047457 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:05:37 crc kubenswrapper[4824]: I0224 00:05:37.047552 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:05:37 crc kubenswrapper[4824]: I0224 00:05:37.047638 4824 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 24 00:05:37 crc kubenswrapper[4824]: E0224 00:05:37.048225 4824 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.151:6443: connect: connection refused" node="crc" Feb 24 
00:05:37 crc kubenswrapper[4824]: I0224 00:05:37.127020 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 24 00:05:37 crc kubenswrapper[4824]: I0224 00:05:37.133577 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 00:05:37 crc kubenswrapper[4824]: I0224 00:05:37.148865 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 24 00:05:37 crc kubenswrapper[4824]: I0224 00:05:37.171838 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 24 00:05:37 crc kubenswrapper[4824]: W0224 00:05:37.173003 4824 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-87ac35aea0d265e4bd6e078ffe3fba67fca63055d00970d942679bd5ceeb8229 WatchSource:0}: Error finding container 87ac35aea0d265e4bd6e078ffe3fba67fca63055d00970d942679bd5ceeb8229: Status 404 returned error can't find the container with id 87ac35aea0d265e4bd6e078ffe3fba67fca63055d00970d942679bd5ceeb8229 Feb 24 00:05:37 crc kubenswrapper[4824]: W0224 00:05:37.174315 4824 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-101334578a872cced90036ea87b59ea7c239365376bb0b884d50dfc8c3e78821 WatchSource:0}: Error finding container 101334578a872cced90036ea87b59ea7c239365376bb0b884d50dfc8c3e78821: Status 404 returned error can't find the container with id 101334578a872cced90036ea87b59ea7c239365376bb0b884d50dfc8c3e78821 Feb 24 00:05:37 crc kubenswrapper[4824]: I0224 00:05:37.175925 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 00:05:37 crc kubenswrapper[4824]: W0224 00:05:37.178399 4824 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-855ced46711932315ddb66ea27f3b3c52c001f44f0187aa2812aa58bac0aaeb5 WatchSource:0}: Error finding container 855ced46711932315ddb66ea27f3b3c52c001f44f0187aa2812aa58bac0aaeb5: Status 404 returned error can't find the container with id 855ced46711932315ddb66ea27f3b3c52c001f44f0187aa2812aa58bac0aaeb5 Feb 24 00:05:37 crc kubenswrapper[4824]: W0224 00:05:37.189055 4824 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-bbabcc68576398a5364c21ce47548bedd20424867b6729e5ddf255c8171ab1b3 WatchSource:0}: Error finding container bbabcc68576398a5364c21ce47548bedd20424867b6729e5ddf255c8171ab1b3: Status 404 returned error can't find the container with id bbabcc68576398a5364c21ce47548bedd20424867b6729e5ddf255c8171ab1b3 Feb 24 00:05:37 crc kubenswrapper[4824]: W0224 00:05:37.197480 4824 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-a92f116098ea86db8d92f15dd3578c2fe61f9b3c6fcd3ce7716ceed9c2911a11 WatchSource:0}: Error finding container a92f116098ea86db8d92f15dd3578c2fe61f9b3c6fcd3ce7716ceed9c2911a11: Status 404 returned error can't find the container with id a92f116098ea86db8d92f15dd3578c2fe61f9b3c6fcd3ce7716ceed9c2911a11 Feb 24 00:05:37 crc kubenswrapper[4824]: E0224 00:05:37.223980 4824 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" 
interval="800ms" Feb 24 00:05:37 crc kubenswrapper[4824]: I0224 00:05:37.448580 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 00:05:37 crc kubenswrapper[4824]: I0224 00:05:37.450786 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:05:37 crc kubenswrapper[4824]: I0224 00:05:37.450857 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:05:37 crc kubenswrapper[4824]: I0224 00:05:37.450869 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:05:37 crc kubenswrapper[4824]: I0224 00:05:37.450901 4824 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 24 00:05:37 crc kubenswrapper[4824]: E0224 00:05:37.451670 4824 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.151:6443: connect: connection refused" node="crc" Feb 24 00:05:37 crc kubenswrapper[4824]: W0224 00:05:37.571121 4824 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Feb 24 00:05:37 crc kubenswrapper[4824]: E0224 00:05:37.571246 4824 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Feb 24 00:05:37 crc kubenswrapper[4824]: I0224 00:05:37.621268 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate 
expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 07:48:49.420678712 +0000 UTC Feb 24 00:05:37 crc kubenswrapper[4824]: I0224 00:05:37.621668 4824 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Feb 24 00:05:37 crc kubenswrapper[4824]: I0224 00:05:37.696503 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a92f116098ea86db8d92f15dd3578c2fe61f9b3c6fcd3ce7716ceed9c2911a11"} Feb 24 00:05:37 crc kubenswrapper[4824]: I0224 00:05:37.697724 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"bbabcc68576398a5364c21ce47548bedd20424867b6729e5ddf255c8171ab1b3"} Feb 24 00:05:37 crc kubenswrapper[4824]: I0224 00:05:37.699460 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"855ced46711932315ddb66ea27f3b3c52c001f44f0187aa2812aa58bac0aaeb5"} Feb 24 00:05:37 crc kubenswrapper[4824]: I0224 00:05:37.700278 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"101334578a872cced90036ea87b59ea7c239365376bb0b884d50dfc8c3e78821"} Feb 24 00:05:37 crc kubenswrapper[4824]: I0224 00:05:37.701230 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"87ac35aea0d265e4bd6e078ffe3fba67fca63055d00970d942679bd5ceeb8229"} Feb 24 00:05:38 crc kubenswrapper[4824]: W0224 00:05:38.001165 4824 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Feb 24 00:05:38 crc kubenswrapper[4824]: E0224 00:05:38.001734 4824 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Feb 24 00:05:38 crc kubenswrapper[4824]: E0224 00:05:38.025805 4824 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="1.6s" Feb 24 00:05:38 crc kubenswrapper[4824]: W0224 00:05:38.049605 4824 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Feb 24 00:05:38 crc kubenswrapper[4824]: E0224 00:05:38.049696 4824 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Feb 24 00:05:38 crc kubenswrapper[4824]: W0224 00:05:38.230292 4824 
reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Feb 24 00:05:38 crc kubenswrapper[4824]: E0224 00:05:38.230398 4824 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Feb 24 00:05:38 crc kubenswrapper[4824]: I0224 00:05:38.252724 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 00:05:38 crc kubenswrapper[4824]: I0224 00:05:38.254242 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:05:38 crc kubenswrapper[4824]: I0224 00:05:38.254284 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:05:38 crc kubenswrapper[4824]: I0224 00:05:38.254299 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:05:38 crc kubenswrapper[4824]: I0224 00:05:38.254362 4824 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 24 00:05:38 crc kubenswrapper[4824]: E0224 00:05:38.254882 4824 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.151:6443: connect: connection refused" node="crc" Feb 24 00:05:38 crc kubenswrapper[4824]: I0224 00:05:38.549812 4824 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 24 00:05:38 crc kubenswrapper[4824]: E0224 00:05:38.551443 4824 
certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Feb 24 00:05:38 crc kubenswrapper[4824]: I0224 00:05:38.621094 4824 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Feb 24 00:05:38 crc kubenswrapper[4824]: I0224 00:05:38.623246 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 01:43:37.45521017 +0000 UTC Feb 24 00:05:38 crc kubenswrapper[4824]: I0224 00:05:38.705934 4824 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a" exitCode=0 Feb 24 00:05:38 crc kubenswrapper[4824]: I0224 00:05:38.706011 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a"} Feb 24 00:05:38 crc kubenswrapper[4824]: I0224 00:05:38.706140 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 00:05:38 crc kubenswrapper[4824]: I0224 00:05:38.707290 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:05:38 crc kubenswrapper[4824]: I0224 00:05:38.707331 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 24 00:05:38 crc kubenswrapper[4824]: I0224 00:05:38.707346 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:05:38 crc kubenswrapper[4824]: I0224 00:05:38.709159 4824 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ac9c6e044265812e8e2116a81efcc38023cc8da2b92093e3f75c6efa4e359c4f" exitCode=0 Feb 24 00:05:38 crc kubenswrapper[4824]: I0224 00:05:38.709229 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ac9c6e044265812e8e2116a81efcc38023cc8da2b92093e3f75c6efa4e359c4f"} Feb 24 00:05:38 crc kubenswrapper[4824]: I0224 00:05:38.709265 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 00:05:38 crc kubenswrapper[4824]: I0224 00:05:38.709303 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 00:05:38 crc kubenswrapper[4824]: I0224 00:05:38.710339 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:05:38 crc kubenswrapper[4824]: I0224 00:05:38.710369 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:05:38 crc kubenswrapper[4824]: I0224 00:05:38.710380 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:05:38 crc kubenswrapper[4824]: I0224 00:05:38.710390 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:05:38 crc kubenswrapper[4824]: I0224 00:05:38.710430 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:05:38 crc kubenswrapper[4824]: I0224 00:05:38.710453 
4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:05:38 crc kubenswrapper[4824]: I0224 00:05:38.711281 4824 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="370e38e80ea6eb0a1add2e84d515cef4e8aeb77dc43beba4d0c3fdeb871f4fb3" exitCode=0 Feb 24 00:05:38 crc kubenswrapper[4824]: I0224 00:05:38.711441 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 00:05:38 crc kubenswrapper[4824]: I0224 00:05:38.711742 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"370e38e80ea6eb0a1add2e84d515cef4e8aeb77dc43beba4d0c3fdeb871f4fb3"} Feb 24 00:05:38 crc kubenswrapper[4824]: I0224 00:05:38.713181 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:05:38 crc kubenswrapper[4824]: I0224 00:05:38.713214 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:05:38 crc kubenswrapper[4824]: I0224 00:05:38.713230 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:05:38 crc kubenswrapper[4824]: I0224 00:05:38.716642 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"213ad3b341ace3f4473aa48ecaaa41814fe670417431f6d5cd04be03482e597c"} Feb 24 00:05:38 crc kubenswrapper[4824]: I0224 00:05:38.716680 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4bf86745569172505df6632421bd1587317cb06e26e70937563bd7d0341c6086"} Feb 24 00:05:38 crc kubenswrapper[4824]: I0224 00:05:38.716691 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ef63a3a20052bbda09997002dbbce1fd4cdf577f00711857db86b460ed4e8165"} Feb 24 00:05:38 crc kubenswrapper[4824]: I0224 00:05:38.716702 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"45746da193a3aaa64e64067e2324afb4e4fb1a90a1357461ed97c0bbb5109f5b"} Feb 24 00:05:38 crc kubenswrapper[4824]: I0224 00:05:38.716778 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 00:05:38 crc kubenswrapper[4824]: I0224 00:05:38.718184 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:05:38 crc kubenswrapper[4824]: I0224 00:05:38.718243 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:05:38 crc kubenswrapper[4824]: I0224 00:05:38.718271 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:05:38 crc kubenswrapper[4824]: I0224 00:05:38.719121 4824 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="1580132fb7d7b9203017c471a27257d03282c1b3dd97167f647ecd3848ba69a2" exitCode=0 Feb 24 00:05:38 crc kubenswrapper[4824]: I0224 00:05:38.719153 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"1580132fb7d7b9203017c471a27257d03282c1b3dd97167f647ecd3848ba69a2"} Feb 24 00:05:38 crc kubenswrapper[4824]: I0224 00:05:38.719287 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 00:05:38 crc kubenswrapper[4824]: I0224 00:05:38.720376 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:05:38 crc kubenswrapper[4824]: I0224 00:05:38.720432 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:05:38 crc kubenswrapper[4824]: I0224 00:05:38.720459 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:05:39 crc kubenswrapper[4824]: E0224 00:05:39.250005 4824 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.151:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189705f6f4979d2a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 00:05:36.61725009 +0000 UTC m=+0.606874579,LastTimestamp:2026-02-24 00:05:36.61725009 +0000 UTC m=+0.606874579,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 00:05:39 crc kubenswrapper[4824]: I0224 00:05:39.621534 4824 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Feb 24 
00:05:39 crc kubenswrapper[4824]: I0224 00:05:39.624001 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 13:00:08.678846863 +0000 UTC Feb 24 00:05:39 crc kubenswrapper[4824]: E0224 00:05:39.627160 4824 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="3.2s" Feb 24 00:05:39 crc kubenswrapper[4824]: I0224 00:05:39.727185 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"81bb21556a01a403315d2c9623aa07ab39ac4a789433423a89eda1da3547b6d2"} Feb 24 00:05:39 crc kubenswrapper[4824]: I0224 00:05:39.727319 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6dd9b54ae9c9e895db2191be45a25d352d8c0617350519ad21f6ed3f2af941fc"} Feb 24 00:05:39 crc kubenswrapper[4824]: I0224 00:05:39.727337 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"bf3a76ba41dba79001ebad9083600800b0f094accb5dc60c2a364aa001909feb"} Feb 24 00:05:39 crc kubenswrapper[4824]: I0224 00:05:39.727353 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"09c8ceeca4d71401bcd6ae7d050ac58f840a0cc91f699f91f799930c896be922"} Feb 24 00:05:39 crc kubenswrapper[4824]: I0224 00:05:39.731221 4824 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" 
containerID="d404a30b7d0515ae872dab9ec3e60629de532d6f27ab0c9f1e7e3918bad30e60" exitCode=0 Feb 24 00:05:39 crc kubenswrapper[4824]: I0224 00:05:39.731409 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 00:05:39 crc kubenswrapper[4824]: I0224 00:05:39.731409 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"d404a30b7d0515ae872dab9ec3e60629de532d6f27ab0c9f1e7e3918bad30e60"} Feb 24 00:05:39 crc kubenswrapper[4824]: I0224 00:05:39.732417 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:05:39 crc kubenswrapper[4824]: I0224 00:05:39.732456 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:05:39 crc kubenswrapper[4824]: I0224 00:05:39.732470 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:05:39 crc kubenswrapper[4824]: I0224 00:05:39.734054 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 00:05:39 crc kubenswrapper[4824]: I0224 00:05:39.734040 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"fdd198d5e4adeaa8e23f5262bc17476b82b1c76b3bfd06b385590eede8c0baa5"} Feb 24 00:05:39 crc kubenswrapper[4824]: I0224 00:05:39.735045 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:05:39 crc kubenswrapper[4824]: I0224 00:05:39.735141 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:05:39 crc kubenswrapper[4824]: I0224 00:05:39.735170 4824 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:05:39 crc kubenswrapper[4824]: I0224 00:05:39.737827 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"52d213e230e92cadfb65d0efd8b4dea058e9f947897ad48c9c9d123ed37263ae"} Feb 24 00:05:39 crc kubenswrapper[4824]: I0224 00:05:39.737894 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 00:05:39 crc kubenswrapper[4824]: I0224 00:05:39.737919 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"357f08b7a73fd26e3a3c8232ff5641ec70da20551310ce9348d3b1462a30735c"} Feb 24 00:05:39 crc kubenswrapper[4824]: I0224 00:05:39.737945 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"a93bd599f0dec3d074b172aab55b3b74283b37d4dddc2e6a97b7319e8d7bd15b"} Feb 24 00:05:39 crc kubenswrapper[4824]: I0224 00:05:39.737841 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 00:05:39 crc kubenswrapper[4824]: I0224 00:05:39.739015 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:05:39 crc kubenswrapper[4824]: I0224 00:05:39.739045 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:05:39 crc kubenswrapper[4824]: I0224 00:05:39.739059 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:05:39 crc kubenswrapper[4824]: I0224 00:05:39.739270 4824 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:05:39 crc kubenswrapper[4824]: I0224 00:05:39.739309 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:05:39 crc kubenswrapper[4824]: I0224 00:05:39.739349 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:05:39 crc kubenswrapper[4824]: I0224 00:05:39.855414 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 00:05:39 crc kubenswrapper[4824]: I0224 00:05:39.857261 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:05:39 crc kubenswrapper[4824]: I0224 00:05:39.857313 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:05:39 crc kubenswrapper[4824]: I0224 00:05:39.857333 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:05:39 crc kubenswrapper[4824]: I0224 00:05:39.857369 4824 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 24 00:05:39 crc kubenswrapper[4824]: E0224 00:05:39.857939 4824 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.151:6443: connect: connection refused" node="crc" Feb 24 00:05:40 crc kubenswrapper[4824]: W0224 00:05:40.323066 4824 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Feb 24 00:05:40 crc kubenswrapper[4824]: E0224 00:05:40.323389 4824 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Feb 24 00:05:40 crc kubenswrapper[4824]: W0224 00:05:40.444028 4824 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Feb 24 00:05:40 crc kubenswrapper[4824]: E0224 00:05:40.444200 4824 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Feb 24 00:05:40 crc kubenswrapper[4824]: I0224 00:05:40.521130 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 00:05:40 crc kubenswrapper[4824]: I0224 00:05:40.621106 4824 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Feb 24 00:05:40 crc kubenswrapper[4824]: I0224 00:05:40.624399 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 08:22:03.243102605 +0000 UTC Feb 24 00:05:40 crc kubenswrapper[4824]: I0224 00:05:40.742008 4824 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 24 00:05:40 crc kubenswrapper[4824]: I0224 00:05:40.747777 4824 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e58d6fa1f448ca33cfbeb0873a4f6698f83f676348dda39a279ad793fce7ced3" exitCode=255 Feb 24 00:05:40 crc kubenswrapper[4824]: I0224 00:05:40.747888 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"e58d6fa1f448ca33cfbeb0873a4f6698f83f676348dda39a279ad793fce7ced3"} Feb 24 00:05:40 crc kubenswrapper[4824]: I0224 00:05:40.747921 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 00:05:40 crc kubenswrapper[4824]: I0224 00:05:40.749325 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:05:40 crc kubenswrapper[4824]: I0224 00:05:40.749383 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:05:40 crc kubenswrapper[4824]: I0224 00:05:40.749404 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:05:40 crc kubenswrapper[4824]: I0224 00:05:40.750136 4824 scope.go:117] "RemoveContainer" containerID="e58d6fa1f448ca33cfbeb0873a4f6698f83f676348dda39a279ad793fce7ced3" Feb 24 00:05:40 crc kubenswrapper[4824]: I0224 00:05:40.751486 4824 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="513b48315443b87dc828225545ae02e1f47580596fc31a7935dd59b80065d26c" exitCode=0 Feb 24 00:05:40 crc kubenswrapper[4824]: I0224 00:05:40.751586 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"513b48315443b87dc828225545ae02e1f47580596fc31a7935dd59b80065d26c"} Feb 24 00:05:40 crc kubenswrapper[4824]: I0224 00:05:40.751662 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 00:05:40 crc kubenswrapper[4824]: I0224 00:05:40.751734 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 00:05:40 crc kubenswrapper[4824]: I0224 00:05:40.751776 4824 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 24 00:05:40 crc kubenswrapper[4824]: I0224 00:05:40.751856 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 00:05:40 crc kubenswrapper[4824]: I0224 00:05:40.751786 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 00:05:40 crc kubenswrapper[4824]: I0224 00:05:40.752622 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:05:40 crc kubenswrapper[4824]: I0224 00:05:40.752668 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:05:40 crc kubenswrapper[4824]: I0224 00:05:40.752685 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:05:40 crc kubenswrapper[4824]: I0224 00:05:40.753409 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:05:40 crc kubenswrapper[4824]: I0224 00:05:40.753452 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:05:40 crc kubenswrapper[4824]: I0224 00:05:40.753425 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:05:40 crc 
kubenswrapper[4824]: I0224 00:05:40.753470 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:05:40 crc kubenswrapper[4824]: I0224 00:05:40.753543 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:05:40 crc kubenswrapper[4824]: I0224 00:05:40.753558 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:05:40 crc kubenswrapper[4824]: I0224 00:05:40.753571 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:05:40 crc kubenswrapper[4824]: I0224 00:05:40.753588 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:05:40 crc kubenswrapper[4824]: I0224 00:05:40.753604 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:05:41 crc kubenswrapper[4824]: I0224 00:05:41.376744 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 00:05:41 crc kubenswrapper[4824]: I0224 00:05:41.624746 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 08:42:33.450466584 +0000 UTC Feb 24 00:05:41 crc kubenswrapper[4824]: I0224 00:05:41.759427 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a5ce8e051c69710768e7b4fba06cc91026ce4d25baf35cd8f9235bfd348a4451"} Feb 24 00:05:41 crc kubenswrapper[4824]: I0224 00:05:41.759499 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7244717b123b325ac83f68b976c1e5761a76b1aacac6aab471d0b644386251d6"} Feb 24 00:05:41 crc kubenswrapper[4824]: I0224 00:05:41.759554 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"65bbd80285850cb217ec0994ab8841efb660fd2488077ee4968b0c1f5d156fbe"} Feb 24 00:05:41 crc kubenswrapper[4824]: I0224 00:05:41.759578 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"50ede0889b30698004a8f288b9cc1bb7d00b194c21ff1936b2743f1e7f246e6f"} Feb 24 00:05:41 crc kubenswrapper[4824]: I0224 00:05:41.761586 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 24 00:05:41 crc kubenswrapper[4824]: I0224 00:05:41.763581 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"becc3a99f880114ef1a12f111c21365920ead17ffb3ba93936683ea411b50486"} Feb 24 00:05:41 crc kubenswrapper[4824]: I0224 00:05:41.763749 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 00:05:41 crc kubenswrapper[4824]: I0224 00:05:41.763810 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 00:05:41 crc kubenswrapper[4824]: I0224 00:05:41.764712 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:05:41 crc kubenswrapper[4824]: I0224 00:05:41.764741 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 
00:05:41 crc kubenswrapper[4824]: I0224 00:05:41.764752 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:05:42 crc kubenswrapper[4824]: I0224 00:05:42.625561 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 23:13:58.191883148 +0000 UTC Feb 24 00:05:42 crc kubenswrapper[4824]: I0224 00:05:42.733621 4824 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 24 00:05:42 crc kubenswrapper[4824]: I0224 00:05:42.777173 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 00:05:42 crc kubenswrapper[4824]: I0224 00:05:42.777198 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"837b5de339793bc363199eae6f141038bf069b2832f6375a8e01d49bef7ea63d"} Feb 24 00:05:42 crc kubenswrapper[4824]: I0224 00:05:42.777277 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 00:05:42 crc kubenswrapper[4824]: I0224 00:05:42.777179 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 00:05:42 crc kubenswrapper[4824]: I0224 00:05:42.778459 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:05:42 crc kubenswrapper[4824]: I0224 00:05:42.778502 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:05:42 crc kubenswrapper[4824]: I0224 00:05:42.778532 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:05:42 crc kubenswrapper[4824]: I0224 00:05:42.778616 4824 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:05:42 crc kubenswrapper[4824]: I0224 00:05:42.778678 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:05:42 crc kubenswrapper[4824]: I0224 00:05:42.778728 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:05:43 crc kubenswrapper[4824]: I0224 00:05:43.058313 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 00:05:43 crc kubenswrapper[4824]: I0224 00:05:43.060263 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:05:43 crc kubenswrapper[4824]: I0224 00:05:43.060321 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:05:43 crc kubenswrapper[4824]: I0224 00:05:43.060335 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:05:43 crc kubenswrapper[4824]: I0224 00:05:43.060373 4824 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 24 00:05:43 crc kubenswrapper[4824]: I0224 00:05:43.521616 4824 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 24 00:05:43 crc kubenswrapper[4824]: I0224 00:05:43.521823 4824 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded 
(Client.Timeout exceeded while awaiting headers)" Feb 24 00:05:43 crc kubenswrapper[4824]: I0224 00:05:43.626013 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 20:35:33.602937106 +0000 UTC Feb 24 00:05:43 crc kubenswrapper[4824]: I0224 00:05:43.779857 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 00:05:43 crc kubenswrapper[4824]: I0224 00:05:43.779927 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 00:05:43 crc kubenswrapper[4824]: I0224 00:05:43.781336 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:05:43 crc kubenswrapper[4824]: I0224 00:05:43.781373 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:05:43 crc kubenswrapper[4824]: I0224 00:05:43.781384 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:05:43 crc kubenswrapper[4824]: I0224 00:05:43.781467 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:05:43 crc kubenswrapper[4824]: I0224 00:05:43.781577 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:05:43 crc kubenswrapper[4824]: I0224 00:05:43.781609 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:05:44 crc kubenswrapper[4824]: I0224 00:05:44.171240 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 00:05:44 crc kubenswrapper[4824]: I0224 00:05:44.626677 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 
2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 06:43:10.373928468 +0000 UTC Feb 24 00:05:44 crc kubenswrapper[4824]: I0224 00:05:44.680081 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 00:05:44 crc kubenswrapper[4824]: I0224 00:05:44.680343 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 00:05:44 crc kubenswrapper[4824]: I0224 00:05:44.682035 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:05:44 crc kubenswrapper[4824]: I0224 00:05:44.682077 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:05:44 crc kubenswrapper[4824]: I0224 00:05:44.682090 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:05:44 crc kubenswrapper[4824]: I0224 00:05:44.778828 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Feb 24 00:05:44 crc kubenswrapper[4824]: I0224 00:05:44.782634 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 00:05:44 crc kubenswrapper[4824]: I0224 00:05:44.782645 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 00:05:44 crc kubenswrapper[4824]: I0224 00:05:44.784464 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:05:44 crc kubenswrapper[4824]: I0224 00:05:44.784462 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:05:44 crc kubenswrapper[4824]: I0224 00:05:44.784507 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:05:44 
crc kubenswrapper[4824]: I0224 00:05:44.784558 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:05:44 crc kubenswrapper[4824]: I0224 00:05:44.784570 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:05:44 crc kubenswrapper[4824]: I0224 00:05:44.784580 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:05:45 crc kubenswrapper[4824]: I0224 00:05:45.015800 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Feb 24 00:05:45 crc kubenswrapper[4824]: I0224 00:05:45.035135 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 24 00:05:45 crc kubenswrapper[4824]: I0224 00:05:45.035431 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 00:05:45 crc kubenswrapper[4824]: I0224 00:05:45.037200 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:05:45 crc kubenswrapper[4824]: I0224 00:05:45.037264 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:05:45 crc kubenswrapper[4824]: I0224 00:05:45.037284 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:05:45 crc kubenswrapper[4824]: I0224 00:05:45.627903 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 16:18:19.614453227 +0000 UTC Feb 24 00:05:45 crc kubenswrapper[4824]: I0224 00:05:45.786363 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 00:05:45 crc kubenswrapper[4824]: I0224 
00:05:45.787323 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:05:45 crc kubenswrapper[4824]: I0224 00:05:45.787353 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:05:45 crc kubenswrapper[4824]: I0224 00:05:45.787362 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:05:46 crc kubenswrapper[4824]: I0224 00:05:46.533555 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 00:05:46 crc kubenswrapper[4824]: I0224 00:05:46.533778 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 00:05:46 crc kubenswrapper[4824]: I0224 00:05:46.534964 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:05:46 crc kubenswrapper[4824]: I0224 00:05:46.534990 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:05:46 crc kubenswrapper[4824]: I0224 00:05:46.535000 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:05:46 crc kubenswrapper[4824]: I0224 00:05:46.628405 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 00:15:00.714139094 +0000 UTC Feb 24 00:05:46 crc kubenswrapper[4824]: E0224 00:05:46.751875 4824 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 24 00:05:47 crc kubenswrapper[4824]: I0224 00:05:47.629331 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 
2025-11-24 01:41:07.822292395 +0000 UTC Feb 24 00:05:48 crc kubenswrapper[4824]: I0224 00:05:48.031057 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 00:05:48 crc kubenswrapper[4824]: I0224 00:05:48.031315 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 00:05:48 crc kubenswrapper[4824]: I0224 00:05:48.033120 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:05:48 crc kubenswrapper[4824]: I0224 00:05:48.033182 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:05:48 crc kubenswrapper[4824]: I0224 00:05:48.033198 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:05:48 crc kubenswrapper[4824]: I0224 00:05:48.037983 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 00:05:48 crc kubenswrapper[4824]: I0224 00:05:48.630112 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 11:27:46.974315946 +0000 UTC Feb 24 00:05:48 crc kubenswrapper[4824]: I0224 00:05:48.794084 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 00:05:48 crc kubenswrapper[4824]: I0224 00:05:48.795348 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:05:48 crc kubenswrapper[4824]: I0224 00:05:48.795394 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:05:48 crc kubenswrapper[4824]: I0224 00:05:48.795410 4824 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:05:48 crc kubenswrapper[4824]: I0224 00:05:48.798803 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 00:05:49 crc kubenswrapper[4824]: I0224 00:05:49.631082 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 18:34:14.553860028 +0000 UTC Feb 24 00:05:49 crc kubenswrapper[4824]: I0224 00:05:49.796446 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 00:05:49 crc kubenswrapper[4824]: I0224 00:05:49.798119 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:05:49 crc kubenswrapper[4824]: I0224 00:05:49.798162 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:05:49 crc kubenswrapper[4824]: I0224 00:05:49.798171 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:05:50 crc kubenswrapper[4824]: I0224 00:05:50.357437 4824 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Feb 24 00:05:50 crc kubenswrapper[4824]: I0224 00:05:50.357584 4824 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Feb 24 00:05:50 crc kubenswrapper[4824]: I0224 
00:05:50.632076 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 08:24:13.826914982 +0000 UTC Feb 24 00:05:50 crc kubenswrapper[4824]: W0224 00:05:50.728143 4824 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Feb 24 00:05:50 crc kubenswrapper[4824]: I0224 00:05:50.728544 4824 trace.go:236] Trace[1574236571]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (24-Feb-2026 00:05:40.727) (total time: 10001ms): Feb 24 00:05:50 crc kubenswrapper[4824]: Trace[1574236571]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10000ms (00:05:50.728) Feb 24 00:05:50 crc kubenswrapper[4824]: Trace[1574236571]: [10.00126959s] [10.00126959s] END Feb 24 00:05:50 crc kubenswrapper[4824]: E0224 00:05:50.728691 4824 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Feb 24 00:05:50 crc kubenswrapper[4824]: W0224 00:05:50.841466 4824 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Feb 24 00:05:50 crc kubenswrapper[4824]: I0224 00:05:50.841851 4824 trace.go:236] Trace[179095445]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (24-Feb-2026 00:05:40.839) (total time: 
10001ms): Feb 24 00:05:50 crc kubenswrapper[4824]: Trace[179095445]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (00:05:50.841) Feb 24 00:05:50 crc kubenswrapper[4824]: Trace[179095445]: [10.001958436s] [10.001958436s] END Feb 24 00:05:50 crc kubenswrapper[4824]: E0224 00:05:50.842000 4824 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Feb 24 00:05:51 crc kubenswrapper[4824]: E0224 00:05:51.338860 4824 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:05:51Z is after 2026-02-23T05:33:13Z" interval="6.4s" Feb 24 00:05:51 crc kubenswrapper[4824]: E0224 00:05:51.340854 4824 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:05:51Z is after 2026-02-23T05:33:13Z" node="crc" Feb 24 00:05:51 crc kubenswrapper[4824]: W0224 00:05:51.344332 4824 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:05:51Z is after 2026-02-23T05:33:13Z Feb 24 00:05:51 crc kubenswrapper[4824]: E0224 00:05:51.344502 4824 
reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:05:51Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 24 00:05:51 crc kubenswrapper[4824]: W0224 00:05:51.345803 4824 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:05:51Z is after 2026-02-23T05:33:13Z Feb 24 00:05:51 crc kubenswrapper[4824]: E0224 00:05:51.345855 4824 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:05:51Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 24 00:05:51 crc kubenswrapper[4824]: E0224 00:05:51.354341 4824 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:05:51Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 24 00:05:51 crc kubenswrapper[4824]: E0224 00:05:51.354923 4824 event.go:368] "Unable to write event (may 
retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:05:51Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189705f6f4979d2a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 00:05:36.61725009 +0000 UTC m=+0.606874579,LastTimestamp:2026-02-24 00:05:36.61725009 +0000 UTC m=+0.606874579,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 24 00:05:51 crc kubenswrapper[4824]: I0224 00:05:51.356008 4824 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:05:51Z is after 2026-02-23T05:33:13Z
Feb 24 00:05:51 crc kubenswrapper[4824]: I0224 00:05:51.359404 4824 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403}
Feb 24 00:05:51 crc kubenswrapper[4824]: I0224 00:05:51.359692 4824 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Feb 24 00:05:51 crc kubenswrapper[4824]: I0224 00:05:51.363431 4824 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403}
Feb 24 00:05:51 crc kubenswrapper[4824]: I0224 00:05:51.363560 4824 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Feb 24 00:05:51 crc kubenswrapper[4824]: I0224 00:05:51.394170 4824 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Feb 24 00:05:51 crc kubenswrapper[4824]: [+]log ok
Feb 24 00:05:51 crc kubenswrapper[4824]: [+]etcd ok
Feb 24 00:05:51 crc kubenswrapper[4824]: [+]poststarthook/openshift.io-startkubeinformers ok
Feb 24 00:05:51 crc kubenswrapper[4824]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok
Feb 24 00:05:51 crc kubenswrapper[4824]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok
Feb 24 00:05:51 crc kubenswrapper[4824]: [+]poststarthook/start-apiserver-admission-initializer ok
Feb 24 00:05:51 crc kubenswrapper[4824]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Feb 24 00:05:51 crc kubenswrapper[4824]: [+]poststarthook/openshift.io-api-request-count-filter ok
Feb 24 00:05:51 crc kubenswrapper[4824]: [+]poststarthook/generic-apiserver-start-informers ok
Feb 24 00:05:51 crc kubenswrapper[4824]: [+]poststarthook/priority-and-fairness-config-consumer ok
Feb 24 00:05:51 crc kubenswrapper[4824]: [+]poststarthook/priority-and-fairness-filter ok
Feb 24 00:05:51 crc kubenswrapper[4824]: [+]poststarthook/storage-object-count-tracker-hook ok
Feb 24 00:05:51 crc kubenswrapper[4824]: [+]poststarthook/start-apiextensions-informers ok
Feb 24 00:05:51 crc kubenswrapper[4824]: [-]poststarthook/start-apiextensions-controllers failed: reason withheld
Feb 24 00:05:51 crc kubenswrapper[4824]: [-]poststarthook/crd-informer-synced failed: reason withheld
Feb 24 00:05:51 crc kubenswrapper[4824]: [+]poststarthook/start-system-namespaces-controller ok
Feb 24 00:05:51 crc kubenswrapper[4824]: [+]poststarthook/start-cluster-authentication-info-controller ok
Feb 24 00:05:51 crc kubenswrapper[4824]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok
Feb 24 00:05:51 crc kubenswrapper[4824]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok
Feb 24 00:05:51 crc kubenswrapper[4824]: [+]poststarthook/start-legacy-token-tracking-controller ok
Feb 24 00:05:51 crc kubenswrapper[4824]: [+]poststarthook/start-service-ip-repair-controllers ok
Feb 24 00:05:51 crc kubenswrapper[4824]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld
Feb 24 00:05:51 crc kubenswrapper[4824]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
Feb 24 00:05:51 crc kubenswrapper[4824]: [-]poststarthook/priority-and-fairness-config-producer failed: reason withheld
Feb 24 00:05:51 crc kubenswrapper[4824]: [-]poststarthook/bootstrap-controller failed: reason withheld
Feb 24 00:05:51 crc kubenswrapper[4824]: [+]poststarthook/aggregator-reload-proxy-client-cert ok
Feb 24 00:05:51 crc kubenswrapper[4824]: [+]poststarthook/start-kube-aggregator-informers ok
Feb 24 00:05:51 crc kubenswrapper[4824]: [+]poststarthook/apiservice-status-local-available-controller ok
Feb 24 00:05:51 crc kubenswrapper[4824]: [+]poststarthook/apiservice-status-remote-available-controller ok
Feb 24 00:05:51 crc kubenswrapper[4824]: [-]poststarthook/apiservice-registration-controller failed: reason withheld
Feb 24 00:05:51 crc kubenswrapper[4824]: [+]poststarthook/apiservice-wait-for-first-sync ok
Feb 24 00:05:51 crc kubenswrapper[4824]: [-]poststarthook/apiservice-discovery-controller failed: reason withheld
Feb 24 00:05:51 crc kubenswrapper[4824]: [+]poststarthook/kube-apiserver-autoregistration ok
Feb 24 00:05:51 crc kubenswrapper[4824]: [+]autoregister-completion ok
Feb 24 00:05:51 crc kubenswrapper[4824]: [+]poststarthook/apiservice-openapi-controller ok
Feb 24 00:05:51 crc kubenswrapper[4824]: [+]poststarthook/apiservice-openapiv3-controller ok
Feb 24 00:05:51 crc kubenswrapper[4824]: livez check failed
Feb 24 00:05:51 crc kubenswrapper[4824]: I0224 00:05:51.394259 4824 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 24 00:05:51 crc kubenswrapper[4824]: I0224 00:05:51.629233 4824 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:05:51Z is after 2026-02-23T05:33:13Z
Feb 24 00:05:51 crc kubenswrapper[4824]: I0224 00:05:51.634020 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 14:53:19.560532553 +0000 UTC
Feb 24 00:05:51 crc kubenswrapper[4824]: I0224 00:05:51.803077 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Feb 24 00:05:51 crc kubenswrapper[4824]: I0224 00:05:51.803512 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Feb 24 00:05:51 crc kubenswrapper[4824]: I0224 00:05:51.805179 4824 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="becc3a99f880114ef1a12f111c21365920ead17ffb3ba93936683ea411b50486" exitCode=255
Feb 24 00:05:51 crc kubenswrapper[4824]: I0224 00:05:51.805223 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"becc3a99f880114ef1a12f111c21365920ead17ffb3ba93936683ea411b50486"}
Feb 24 00:05:51 crc kubenswrapper[4824]: I0224 00:05:51.805272 4824 scope.go:117] "RemoveContainer" containerID="e58d6fa1f448ca33cfbeb0873a4f6698f83f676348dda39a279ad793fce7ced3"
Feb 24 00:05:51 crc kubenswrapper[4824]: I0224 00:05:51.805414 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 00:05:51 crc kubenswrapper[4824]: I0224 00:05:51.806573 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:05:51 crc kubenswrapper[4824]: I0224 00:05:51.806598 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:05:51 crc kubenswrapper[4824]: I0224 00:05:51.806609 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:05:51 crc kubenswrapper[4824]: I0224 00:05:51.807077 4824 scope.go:117] "RemoveContainer" containerID="becc3a99f880114ef1a12f111c21365920ead17ffb3ba93936683ea411b50486"
Feb 24 00:05:51 crc kubenswrapper[4824]: E0224 00:05:51.807320 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 24 00:05:52 crc kubenswrapper[4824]: I0224 00:05:52.623355 4824 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:05:52Z is after 2026-02-23T05:33:13Z
Feb 24 00:05:52 crc kubenswrapper[4824]: I0224 00:05:52.634634 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 01:26:58.995006326 +0000 UTC
Feb 24 00:05:52 crc kubenswrapper[4824]: I0224 00:05:52.810118 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Feb 24 00:05:53 crc kubenswrapper[4824]: I0224 00:05:53.523076 4824 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 24 00:05:53 crc kubenswrapper[4824]: I0224 00:05:53.523210 4824 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 24 00:05:53 crc kubenswrapper[4824]: I0224 00:05:53.626977 4824 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:05:53Z is after 2026-02-23T05:33:13Z
Feb 24 00:05:53 crc kubenswrapper[4824]: I0224 00:05:53.635226 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 09:27:15.479684268 +0000 UTC
Feb 24 00:05:54 crc kubenswrapper[4824]: I0224 00:05:54.624745 4824 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:05:54Z is after 2026-02-23T05:33:13Z
Feb 24 00:05:54 crc kubenswrapper[4824]: I0224 00:05:54.635969 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 01:50:11.856617984 +0000 UTC
Feb 24 00:05:55 crc kubenswrapper[4824]: I0224 00:05:55.048094 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Feb 24 00:05:55 crc kubenswrapper[4824]: I0224 00:05:55.048348 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 00:05:55 crc kubenswrapper[4824]: I0224 00:05:55.049741 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:05:55 crc kubenswrapper[4824]: I0224 00:05:55.049792 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:05:55 crc kubenswrapper[4824]: I0224 00:05:55.049806 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:05:55 crc kubenswrapper[4824]: I0224 00:05:55.061893 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Feb 24 00:05:55 crc kubenswrapper[4824]: W0224 00:05:55.347720 4824 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:05:55Z is after 2026-02-23T05:33:13Z
Feb 24 00:05:55 crc kubenswrapper[4824]: E0224 00:05:55.347860 4824 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:05:55Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Feb 24 00:05:55 crc kubenswrapper[4824]: I0224 00:05:55.623232 4824 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:05:55Z is after 2026-02-23T05:33:13Z
Feb 24 00:05:55 crc kubenswrapper[4824]: I0224 00:05:55.636576 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 16:40:10.504136442 +0000 UTC
Feb 24 00:05:55 crc kubenswrapper[4824]: W0224 00:05:55.727108 4824 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:05:55Z is after 2026-02-23T05:33:13Z
Feb 24 00:05:55 crc kubenswrapper[4824]: E0224 00:05:55.727249 4824 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:05:55Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Feb 24 00:05:55 crc kubenswrapper[4824]: I0224 00:05:55.820119 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 00:05:55 crc kubenswrapper[4824]: I0224 00:05:55.821347 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:05:55 crc kubenswrapper[4824]: I0224 00:05:55.821425 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:05:55 crc kubenswrapper[4824]: I0224 00:05:55.821450 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:05:56 crc kubenswrapper[4824]: I0224 00:05:56.383419 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 24 00:05:56 crc kubenswrapper[4824]: I0224 00:05:56.383744 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 00:05:56 crc kubenswrapper[4824]: I0224 00:05:56.385620 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:05:56 crc kubenswrapper[4824]: I0224 00:05:56.385667 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:05:56 crc kubenswrapper[4824]: I0224 00:05:56.385679 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:05:56 crc kubenswrapper[4824]: I0224 00:05:56.386372 4824 scope.go:117] "RemoveContainer" containerID="becc3a99f880114ef1a12f111c21365920ead17ffb3ba93936683ea411b50486"
Feb 24 00:05:56 crc kubenswrapper[4824]: E0224 00:05:56.386616 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 24 00:05:56 crc kubenswrapper[4824]: I0224 00:05:56.392451 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 24 00:05:56 crc kubenswrapper[4824]: I0224 00:05:56.624962 4824 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:05:56Z is after 2026-02-23T05:33:13Z
Feb 24 00:05:56 crc kubenswrapper[4824]: I0224 00:05:56.637326 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 03:30:41.489689839 +0000 UTC
Feb 24 00:05:56 crc kubenswrapper[4824]: E0224 00:05:56.752171 4824 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 24 00:05:56 crc kubenswrapper[4824]: I0224 00:05:56.822096 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 00:05:56 crc kubenswrapper[4824]: I0224 00:05:56.824128 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:05:56 crc kubenswrapper[4824]: I0224 00:05:56.824193 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:05:56 crc kubenswrapper[4824]: I0224 00:05:56.824214 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:05:56 crc kubenswrapper[4824]: I0224 00:05:56.825071 4824 scope.go:117] "RemoveContainer" containerID="becc3a99f880114ef1a12f111c21365920ead17ffb3ba93936683ea411b50486"
Feb 24 00:05:56 crc kubenswrapper[4824]: E0224 00:05:56.825379 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 24 00:05:57 crc kubenswrapper[4824]: I0224 00:05:57.571887 4824 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 24 00:05:57 crc kubenswrapper[4824]: I0224 00:05:57.623288 4824 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:05:57Z is after 2026-02-23T05:33:13Z
Feb 24 00:05:57 crc kubenswrapper[4824]: I0224 00:05:57.637728 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 01:33:16.178920795 +0000 UTC
Feb 24 00:05:57 crc kubenswrapper[4824]: I0224 00:05:57.740959 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 00:05:57 crc kubenswrapper[4824]: I0224 00:05:57.742926 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:05:57 crc kubenswrapper[4824]: I0224 00:05:57.743000 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:05:57 crc kubenswrapper[4824]: I0224 00:05:57.743028 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:05:57 crc kubenswrapper[4824]: I0224 00:05:57.743112 4824 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 24 00:05:57 crc kubenswrapper[4824]: E0224 00:05:57.747045 4824 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:05:57Z is after 2026-02-23T05:33:13Z" interval="7s"
Feb 24 00:05:57 crc kubenswrapper[4824]: E0224 00:05:57.750392 4824 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:05:57Z is after 2026-02-23T05:33:13Z" node="crc"
Feb 24 00:05:57 crc kubenswrapper[4824]: I0224 00:05:57.825655 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 00:05:57 crc kubenswrapper[4824]: I0224 00:05:57.827129 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:05:57 crc kubenswrapper[4824]: I0224 00:05:57.827198 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:05:57 crc kubenswrapper[4824]: I0224 00:05:57.827214 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:05:57 crc kubenswrapper[4824]: I0224 00:05:57.828058 4824 scope.go:117] "RemoveContainer" containerID="becc3a99f880114ef1a12f111c21365920ead17ffb3ba93936683ea411b50486"
Feb 24 00:05:57 crc kubenswrapper[4824]: E0224 00:05:57.828296 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 24 00:05:58 crc kubenswrapper[4824]: I0224 00:05:58.626434 4824 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:05:58Z is after 2026-02-23T05:33:13Z
Feb 24 00:05:58 crc kubenswrapper[4824]: I0224 00:05:58.638903 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 12:23:59.352083662 +0000 UTC
Feb 24 00:05:59 crc kubenswrapper[4824]: W0224 00:05:59.302545 4824 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:05:59Z is after 2026-02-23T05:33:13Z
Feb 24 00:05:59 crc kubenswrapper[4824]: E0224 00:05:59.302652 4824 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:05:59Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Feb 24 00:05:59 crc kubenswrapper[4824]: I0224 00:05:59.624579 4824 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:05:59Z is after 2026-02-23T05:33:13Z
Feb 24 00:05:59 crc kubenswrapper[4824]: I0224 00:05:59.639877 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 13:32:26.675301132 +0000 UTC
Feb 24 00:06:00 crc kubenswrapper[4824]: I0224 00:06:00.007955 4824 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Feb 24 00:06:00 crc kubenswrapper[4824]: E0224 00:06:00.012221 4824 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:00Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Feb 24 00:06:00 crc kubenswrapper[4824]: I0224 00:06:00.626239 4824 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:00Z is after 2026-02-23T05:33:13Z
Feb 24 00:06:00 crc kubenswrapper[4824]: I0224 00:06:00.641014 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 19:16:09.556710406 +0000 UTC
Feb 24 00:06:01 crc kubenswrapper[4824]: E0224 00:06:01.360848 4824 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:01Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189705f6f4979d2a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 00:05:36.61725009 +0000 UTC m=+0.606874579,LastTimestamp:2026-02-24 00:05:36.61725009 +0000 UTC m=+0.606874579,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 24 00:06:01 crc kubenswrapper[4824]: I0224 00:06:01.626327 4824 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:01Z is after 2026-02-23T05:33:13Z
Feb 24 00:06:01 crc kubenswrapper[4824]: I0224 00:06:01.641802 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 18:37:42.838045738 +0000 UTC
Feb 24 00:06:01 crc kubenswrapper[4824]: W0224 00:06:01.754690 4824 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:01Z is after 2026-02-23T05:33:13Z
Feb 24 00:06:01 crc kubenswrapper[4824]: E0224 00:06:01.754810 4824 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:01Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Feb 24 00:06:02 crc kubenswrapper[4824]: I0224 00:06:02.625025 4824 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:02Z is after 2026-02-23T05:33:13Z
Feb 24 00:06:02 crc kubenswrapper[4824]: I0224 00:06:02.642439 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 10:04:33.143647884 +0000 UTC
Feb 24 00:06:03 crc kubenswrapper[4824]: I0224 00:06:03.521364 4824 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 24 00:06:03 crc kubenswrapper[4824]: I0224 00:06:03.521552 4824 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 24 00:06:03 crc kubenswrapper[4824]: I0224 00:06:03.521654 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 24 00:06:03 crc kubenswrapper[4824]: I0224 00:06:03.521878 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 00:06:03 crc kubenswrapper[4824]: I0224 00:06:03.523569 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:06:03 crc kubenswrapper[4824]: I0224 00:06:03.523637 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:06:03 crc kubenswrapper[4824]: I0224 00:06:03.523656 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:06:03 crc kubenswrapper[4824]: I0224 00:06:03.524497 4824 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"ef63a3a20052bbda09997002dbbce1fd4cdf577f00711857db86b460ed4e8165"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted"
Feb 24 00:06:03 crc kubenswrapper[4824]: I0224 00:06:03.524845 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://ef63a3a20052bbda09997002dbbce1fd4cdf577f00711857db86b460ed4e8165" gracePeriod=30
Feb 24 00:06:03 crc kubenswrapper[4824]: I0224 00:06:03.625289 4824 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:03Z is after 2026-02-23T05:33:13Z
Feb 24 00:06:03 crc kubenswrapper[4824]: I0224 00:06:03.642657 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 02:35:56.839771428 +0000 UTC
Feb 24 00:06:03 crc kubenswrapper[4824]: I0224 00:06:03.846855 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Feb 24 00:06:03 crc kubenswrapper[4824]: I0224 00:06:03.847356 4824 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="ef63a3a20052bbda09997002dbbce1fd4cdf577f00711857db86b460ed4e8165" exitCode=255
Feb 24 00:06:03 crc kubenswrapper[4824]: I0224 00:06:03.847415 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"ef63a3a20052bbda09997002dbbce1fd4cdf577f00711857db86b460ed4e8165"}
Feb 24 00:06:04 crc kubenswrapper[4824]: W0224 00:06:04.023002 4824 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:04Z is after 2026-02-23T05:33:13Z
Feb 24 00:06:04 crc kubenswrapper[4824]: E0224 00:06:04.023541 4824 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:04Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Feb 24 00:06:04 crc kubenswrapper[4824]: I0224 00:06:04.624759 4824 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:04Z is after 2026-02-23T05:33:13Z
Feb 24 00:06:04 crc kubenswrapper[4824]: I0224 00:06:04.643283 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 23:50:53.34762439 +0000 UTC
Feb 24 00:06:04 crc kubenswrapper[4824]: I0224 00:06:04.751056 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 00:06:04 crc kubenswrapper[4824]: E0224 00:06:04.752703 4824 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:04Z is after 2026-02-23T05:33:13Z" interval="7s"
Feb 24 00:06:04 crc kubenswrapper[4824]: I0224 00:06:04.753461 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:06:04 crc kubenswrapper[4824]: I0224 00:06:04.753548 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:06:04 crc kubenswrapper[4824]: I0224 00:06:04.753572 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:06:04 crc kubenswrapper[4824]: I0224 00:06:04.753618 4824 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 24 00:06:04 crc kubenswrapper[4824]: E0224 00:06:04.757989 4824 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:04Z is after 2026-02-23T05:33:13Z" node="crc"
Feb 24 00:06:04 crc kubenswrapper[4824]: I0224 00:06:04.854349 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Feb 24 00:06:04 crc kubenswrapper[4824]: I0224 00:06:04.854928 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8fe1cb1e114c1440630a33e3dd127af3600c520e37148cc01ab2e4ed346941ba"}
Feb 24 00:06:04 crc kubenswrapper[4824]: I0224 00:06:04.855144 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 00:06:04 crc kubenswrapper[4824]: I0224 00:06:04.856576 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:06:04 crc kubenswrapper[4824]: I0224 00:06:04.856622 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:06:04 crc kubenswrapper[4824]: I0224 00:06:04.856635 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:06:05 crc kubenswrapper[4824]: I0224 00:06:05.625643 4824 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:05Z is after 2026-02-23T05:33:13Z
Feb 24 00:06:05 crc kubenswrapper[4824]: I0224 00:06:05.644207 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 02:42:58.458552667 +0000 UTC
Feb 24 00:06:05 crc kubenswrapper[4824]: I0224 00:06:05.856880 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 00:06:05 crc kubenswrapper[4824]: I0224 00:06:05.857804 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:06:05 crc kubenswrapper[4824]: I0224 00:06:05.857832 4824 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:06:05 crc kubenswrapper[4824]: I0224 00:06:05.857843 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:06:06 crc kubenswrapper[4824]: I0224 00:06:06.626138 4824 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:06Z is after 2026-02-23T05:33:13Z Feb 24 00:06:06 crc kubenswrapper[4824]: I0224 00:06:06.645242 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 00:19:48.502240005 +0000 UTC Feb 24 00:06:06 crc kubenswrapper[4824]: E0224 00:06:06.752285 4824 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 24 00:06:07 crc kubenswrapper[4824]: W0224 00:06:07.058133 4824 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:07Z is after 2026-02-23T05:33:13Z Feb 24 00:06:07 crc kubenswrapper[4824]: E0224 00:06:07.058225 4824 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:07Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 24 00:06:07 crc 
kubenswrapper[4824]: I0224 00:06:07.623910 4824 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:07Z is after 2026-02-23T05:33:13Z Feb 24 00:06:07 crc kubenswrapper[4824]: I0224 00:06:07.646385 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 05:05:12.383249084 +0000 UTC Feb 24 00:06:08 crc kubenswrapper[4824]: I0224 00:06:08.624169 4824 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:08Z is after 2026-02-23T05:33:13Z Feb 24 00:06:08 crc kubenswrapper[4824]: I0224 00:06:08.647481 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 20:13:44.520690878 +0000 UTC Feb 24 00:06:09 crc kubenswrapper[4824]: I0224 00:06:09.626678 4824 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:09Z is after 2026-02-23T05:33:13Z Feb 24 00:06:09 crc kubenswrapper[4824]: I0224 00:06:09.648099 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 05:21:10.623568405 +0000 UTC Feb 24 00:06:10 crc kubenswrapper[4824]: I0224 00:06:10.522383 4824 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 00:06:10 crc kubenswrapper[4824]: I0224 00:06:10.522689 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 00:06:10 crc kubenswrapper[4824]: I0224 00:06:10.525026 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:06:10 crc kubenswrapper[4824]: I0224 00:06:10.525290 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:06:10 crc kubenswrapper[4824]: I0224 00:06:10.525457 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:06:10 crc kubenswrapper[4824]: I0224 00:06:10.624573 4824 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:10Z is after 2026-02-23T05:33:13Z Feb 24 00:06:10 crc kubenswrapper[4824]: I0224 00:06:10.648720 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 01:04:19.766824538 +0000 UTC Feb 24 00:06:10 crc kubenswrapper[4824]: I0224 00:06:10.693592 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 00:06:10 crc kubenswrapper[4824]: I0224 00:06:10.695388 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:06:10 crc kubenswrapper[4824]: I0224 00:06:10.695577 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:06:10 crc kubenswrapper[4824]: 
I0224 00:06:10.695596 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:06:10 crc kubenswrapper[4824]: I0224 00:06:10.696414 4824 scope.go:117] "RemoveContainer" containerID="becc3a99f880114ef1a12f111c21365920ead17ffb3ba93936683ea411b50486" Feb 24 00:06:11 crc kubenswrapper[4824]: E0224 00:06:11.366756 4824 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:11Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189705f6f4979d2a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 00:05:36.61725009 +0000 UTC m=+0.606874579,LastTimestamp:2026-02-24 00:05:36.61725009 +0000 UTC m=+0.606874579,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 00:06:11 crc kubenswrapper[4824]: I0224 00:06:11.623856 4824 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:11Z is after 2026-02-23T05:33:13Z Feb 24 00:06:11 crc kubenswrapper[4824]: I0224 00:06:11.649303 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 21:54:27.099766513 +0000 UTC Feb 24 00:06:11 crc kubenswrapper[4824]: E0224 00:06:11.757559 4824 controller.go:145] 
"Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:11Z is after 2026-02-23T05:33:13Z" interval="7s" Feb 24 00:06:11 crc kubenswrapper[4824]: I0224 00:06:11.758368 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 00:06:11 crc kubenswrapper[4824]: I0224 00:06:11.759418 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:06:11 crc kubenswrapper[4824]: I0224 00:06:11.759461 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:06:11 crc kubenswrapper[4824]: I0224 00:06:11.759473 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:06:11 crc kubenswrapper[4824]: I0224 00:06:11.759504 4824 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 24 00:06:11 crc kubenswrapper[4824]: E0224 00:06:11.762802 4824 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:11Z is after 2026-02-23T05:33:13Z" node="crc" Feb 24 00:06:11 crc kubenswrapper[4824]: I0224 00:06:11.875090 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 24 00:06:11 crc kubenswrapper[4824]: I0224 00:06:11.875820 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 
24 00:06:11 crc kubenswrapper[4824]: I0224 00:06:11.877815 4824 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1f1c1f049b88250d60dc213ff2c7023d6f8204c87e8e78cff43d73af450d7e46" exitCode=255 Feb 24 00:06:11 crc kubenswrapper[4824]: I0224 00:06:11.877900 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"1f1c1f049b88250d60dc213ff2c7023d6f8204c87e8e78cff43d73af450d7e46"} Feb 24 00:06:11 crc kubenswrapper[4824]: I0224 00:06:11.877969 4824 scope.go:117] "RemoveContainer" containerID="becc3a99f880114ef1a12f111c21365920ead17ffb3ba93936683ea411b50486" Feb 24 00:06:11 crc kubenswrapper[4824]: I0224 00:06:11.878253 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 00:06:11 crc kubenswrapper[4824]: I0224 00:06:11.879362 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:06:11 crc kubenswrapper[4824]: I0224 00:06:11.879414 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:06:11 crc kubenswrapper[4824]: I0224 00:06:11.879430 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:06:11 crc kubenswrapper[4824]: I0224 00:06:11.880275 4824 scope.go:117] "RemoveContainer" containerID="1f1c1f049b88250d60dc213ff2c7023d6f8204c87e8e78cff43d73af450d7e46" Feb 24 00:06:11 crc kubenswrapper[4824]: E0224 00:06:11.881344 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 24 00:06:12 crc kubenswrapper[4824]: I0224 00:06:12.625448 4824 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:12Z is after 2026-02-23T05:33:13Z Feb 24 00:06:12 crc kubenswrapper[4824]: I0224 00:06:12.649898 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 19:54:55.494167556 +0000 UTC Feb 24 00:06:12 crc kubenswrapper[4824]: I0224 00:06:12.884200 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 24 00:06:13 crc kubenswrapper[4824]: I0224 00:06:13.522592 4824 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 24 00:06:13 crc kubenswrapper[4824]: I0224 00:06:13.522742 4824 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 24 00:06:13 crc kubenswrapper[4824]: I0224 00:06:13.625733 4824 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: 
Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:13Z is after 2026-02-23T05:33:13Z Feb 24 00:06:13 crc kubenswrapper[4824]: I0224 00:06:13.649993 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 20:16:53.33940236 +0000 UTC Feb 24 00:06:14 crc kubenswrapper[4824]: W0224 00:06:14.603328 4824 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:14Z is after 2026-02-23T05:33:13Z Feb 24 00:06:14 crc kubenswrapper[4824]: E0224 00:06:14.603448 4824 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:14Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 24 00:06:14 crc kubenswrapper[4824]: I0224 00:06:14.624072 4824 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:14Z is after 2026-02-23T05:33:13Z Feb 24 00:06:14 crc kubenswrapper[4824]: I0224 00:06:14.650337 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 
2025-12-12 12:49:39.919722023 +0000 UTC Feb 24 00:06:14 crc kubenswrapper[4824]: I0224 00:06:14.680822 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 00:06:14 crc kubenswrapper[4824]: I0224 00:06:14.681122 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 00:06:14 crc kubenswrapper[4824]: I0224 00:06:14.682942 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:06:14 crc kubenswrapper[4824]: I0224 00:06:14.683000 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:06:14 crc kubenswrapper[4824]: I0224 00:06:14.683009 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:06:15 crc kubenswrapper[4824]: I0224 00:06:15.625946 4824 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:15Z is after 2026-02-23T05:33:13Z Feb 24 00:06:15 crc kubenswrapper[4824]: I0224 00:06:15.651274 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 01:48:09.262294545 +0000 UTC Feb 24 00:06:16 crc kubenswrapper[4824]: I0224 00:06:16.625769 4824 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:16Z is after 2026-02-23T05:33:13Z Feb 24 00:06:16 crc 
kubenswrapper[4824]: I0224 00:06:16.652114 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 19:07:43.728990135 +0000 UTC Feb 24 00:06:16 crc kubenswrapper[4824]: E0224 00:06:16.752635 4824 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 24 00:06:16 crc kubenswrapper[4824]: I0224 00:06:16.927775 4824 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 24 00:06:16 crc kubenswrapper[4824]: E0224 00:06:16.934427 4824 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:16Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 24 00:06:16 crc kubenswrapper[4824]: E0224 00:06:16.935722 4824 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError" Feb 24 00:06:17 crc kubenswrapper[4824]: I0224 00:06:17.571427 4824 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 00:06:17 crc kubenswrapper[4824]: I0224 00:06:17.571650 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 00:06:17 crc kubenswrapper[4824]: I0224 00:06:17.573174 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:06:17 crc kubenswrapper[4824]: I0224 
00:06:17.573272 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:06:17 crc kubenswrapper[4824]: I0224 00:06:17.573302 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:06:17 crc kubenswrapper[4824]: I0224 00:06:17.574250 4824 scope.go:117] "RemoveContainer" containerID="1f1c1f049b88250d60dc213ff2c7023d6f8204c87e8e78cff43d73af450d7e46" Feb 24 00:06:17 crc kubenswrapper[4824]: E0224 00:06:17.574617 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 24 00:06:17 crc kubenswrapper[4824]: I0224 00:06:17.623791 4824 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:17Z is after 2026-02-23T05:33:13Z Feb 24 00:06:17 crc kubenswrapper[4824]: I0224 00:06:17.653163 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 04:15:27.337103489 +0000 UTC Feb 24 00:06:18 crc kubenswrapper[4824]: I0224 00:06:18.623836 4824 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:18Z is after 2026-02-23T05:33:13Z Feb 24 00:06:18 crc 
kubenswrapper[4824]: I0224 00:06:18.653437 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 23:35:08.959263457 +0000 UTC Feb 24 00:06:18 crc kubenswrapper[4824]: E0224 00:06:18.760492 4824 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:18Z is after 2026-02-23T05:33:13Z" interval="7s" Feb 24 00:06:18 crc kubenswrapper[4824]: I0224 00:06:18.763792 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 00:06:18 crc kubenswrapper[4824]: I0224 00:06:18.765565 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:06:18 crc kubenswrapper[4824]: I0224 00:06:18.765645 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:06:18 crc kubenswrapper[4824]: I0224 00:06:18.765674 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:06:18 crc kubenswrapper[4824]: I0224 00:06:18.765723 4824 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 24 00:06:18 crc kubenswrapper[4824]: E0224 00:06:18.768322 4824 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:18Z is after 2026-02-23T05:33:13Z" node="crc" Feb 24 00:06:18 crc kubenswrapper[4824]: W0224 00:06:18.914981 4824 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get 
"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:18Z is after 2026-02-23T05:33:13Z Feb 24 00:06:18 crc kubenswrapper[4824]: E0224 00:06:18.915120 4824 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:18Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 24 00:06:19 crc kubenswrapper[4824]: I0224 00:06:19.626280 4824 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:19Z is after 2026-02-23T05:33:13Z Feb 24 00:06:19 crc kubenswrapper[4824]: I0224 00:06:19.653900 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 19:34:56.182997188 +0000 UTC Feb 24 00:06:20 crc kubenswrapper[4824]: W0224 00:06:20.020411 4824 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:20Z is after 2026-02-23T05:33:13Z Feb 24 00:06:20 crc kubenswrapper[4824]: E0224 00:06:20.020586 4824 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch 
*v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:20Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Feb 24 00:06:20 crc kubenswrapper[4824]: I0224 00:06:20.357586 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 24 00:06:20 crc kubenswrapper[4824]: I0224 00:06:20.357943 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 00:06:20 crc kubenswrapper[4824]: I0224 00:06:20.359856 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:06:20 crc kubenswrapper[4824]: I0224 00:06:20.359934 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:06:20 crc kubenswrapper[4824]: I0224 00:06:20.359955 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:06:20 crc kubenswrapper[4824]: I0224 00:06:20.360790 4824 scope.go:117] "RemoveContainer" containerID="1f1c1f049b88250d60dc213ff2c7023d6f8204c87e8e78cff43d73af450d7e46"
Feb 24 00:06:20 crc kubenswrapper[4824]: E0224 00:06:20.361032 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 24 00:06:20 crc kubenswrapper[4824]: I0224 00:06:20.626621 4824 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:20Z is after 2026-02-23T05:33:13Z
Feb 24 00:06:20 crc kubenswrapper[4824]: I0224 00:06:20.655171 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 11:25:32.459802537 +0000 UTC
Feb 24 00:06:21 crc kubenswrapper[4824]: E0224 00:06:21.372573 4824 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:21Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189705f6f4979d2a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 00:05:36.61725009 +0000 UTC m=+0.606874579,LastTimestamp:2026-02-24 00:05:36.61725009 +0000 UTC m=+0.606874579,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 24 00:06:21 crc kubenswrapper[4824]: I0224 00:06:21.624796 4824 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:21Z is after 2026-02-23T05:33:13Z
Feb 24 00:06:21 crc kubenswrapper[4824]: I0224 00:06:21.655412 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 11:30:23.519255167 +0000 UTC
Feb 24 00:06:22 crc kubenswrapper[4824]: I0224 00:06:22.626374 4824 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:22Z is after 2026-02-23T05:33:13Z
Feb 24 00:06:22 crc kubenswrapper[4824]: I0224 00:06:22.656044 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 20:07:20.262902446 +0000 UTC
Feb 24 00:06:23 crc kubenswrapper[4824]: I0224 00:06:23.521774 4824 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 24 00:06:23 crc kubenswrapper[4824]: I0224 00:06:23.521901 4824 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 24 00:06:23 crc kubenswrapper[4824]: I0224 00:06:23.625988 4824 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:23Z is after 2026-02-23T05:33:13Z
Feb 24 00:06:23 crc kubenswrapper[4824]: I0224 00:06:23.656429 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 16:49:16.844178692 +0000 UTC
Feb 24 00:06:24 crc kubenswrapper[4824]: I0224 00:06:24.624633 4824 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:24Z is after 2026-02-23T05:33:13Z
Feb 24 00:06:24 crc kubenswrapper[4824]: I0224 00:06:24.656717 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 03:38:49.301233849 +0000 UTC
Feb 24 00:06:25 crc kubenswrapper[4824]: I0224 00:06:25.042837 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 24 00:06:25 crc kubenswrapper[4824]: I0224 00:06:25.043057 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 00:06:25 crc kubenswrapper[4824]: I0224 00:06:25.044667 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:06:25 crc kubenswrapper[4824]: I0224 00:06:25.044705 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:06:25 crc kubenswrapper[4824]: I0224 00:06:25.044716 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:06:25 crc kubenswrapper[4824]: W0224 00:06:25.046959 4824 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:25Z is after 2026-02-23T05:33:13Z
Feb 24 00:06:25 crc kubenswrapper[4824]: E0224 00:06:25.047059 4824 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:25Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Feb 24 00:06:25 crc kubenswrapper[4824]: I0224 00:06:25.625648 4824 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:25Z is after 2026-02-23T05:33:13Z
Feb 24 00:06:25 crc kubenswrapper[4824]: I0224 00:06:25.657629 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 23:18:32.420175452 +0000 UTC
Feb 24 00:06:25 crc kubenswrapper[4824]: E0224 00:06:25.767212 4824 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:25Z is after 2026-02-23T05:33:13Z" interval="7s"
Feb 24 00:06:25 crc kubenswrapper[4824]: I0224 00:06:25.768601 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 00:06:25 crc kubenswrapper[4824]: I0224 00:06:25.770434 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:06:25 crc kubenswrapper[4824]: I0224 00:06:25.770501 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:06:25 crc kubenswrapper[4824]: I0224 00:06:25.770556 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:06:25 crc kubenswrapper[4824]: I0224 00:06:25.770611 4824 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 24 00:06:25 crc kubenswrapper[4824]: E0224 00:06:25.776169 4824 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:25Z is after 2026-02-23T05:33:13Z" node="crc"
Feb 24 00:06:26 crc kubenswrapper[4824]: I0224 00:06:26.625623 4824 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:26Z is after 2026-02-23T05:33:13Z
Feb 24 00:06:26 crc kubenswrapper[4824]: I0224 00:06:26.658339 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 17:44:08.16625045 +0000 UTC
Feb 24 00:06:26 crc kubenswrapper[4824]: E0224 00:06:26.752807 4824 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 24 00:06:27 crc kubenswrapper[4824]: I0224 00:06:27.624423 4824 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:27Z is after 2026-02-23T05:33:13Z
Feb 24 00:06:27 crc kubenswrapper[4824]: I0224 00:06:27.658725 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 15:15:20.729260485 +0000 UTC
Feb 24 00:06:28 crc kubenswrapper[4824]: I0224 00:06:28.626657 4824 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:28Z is after 2026-02-23T05:33:13Z
Feb 24 00:06:28 crc kubenswrapper[4824]: I0224 00:06:28.659576 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 12:29:16.388457681 +0000 UTC
Feb 24 00:06:29 crc kubenswrapper[4824]: I0224 00:06:29.625349 4824 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:29Z is after 2026-02-23T05:33:13Z
Feb 24 00:06:29 crc kubenswrapper[4824]: I0224 00:06:29.660640 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 21:51:25.431108001 +0000 UTC
Feb 24 00:06:30 crc kubenswrapper[4824]: I0224 00:06:30.626209 4824 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:30Z is after 2026-02-23T05:33:13Z
Feb 24 00:06:30 crc kubenswrapper[4824]: I0224 00:06:30.660843 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 14:07:06.745187889 +0000 UTC
Feb 24 00:06:31 crc kubenswrapper[4824]: E0224 00:06:31.379065 4824 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:31Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189705f6f4979d2a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 00:05:36.61725009 +0000 UTC m=+0.606874579,LastTimestamp:2026-02-24 00:05:36.61725009 +0000 UTC m=+0.606874579,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 24 00:06:31 crc kubenswrapper[4824]: I0224 00:06:31.624697 4824 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:31Z is after 2026-02-23T05:33:13Z
Feb 24 00:06:31 crc kubenswrapper[4824]: I0224 00:06:31.661599 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 21:51:41.262964702 +0000 UTC
Feb 24 00:06:32 crc kubenswrapper[4824]: I0224 00:06:32.625368 4824 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:32Z is after 2026-02-23T05:33:13Z
Feb 24 00:06:32 crc kubenswrapper[4824]: I0224 00:06:32.662045 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 05:30:52.4427183 +0000 UTC
Feb 24 00:06:32 crc kubenswrapper[4824]: E0224 00:06:32.773081 4824 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:32Z is after 2026-02-23T05:33:13Z" interval="7s"
Feb 24 00:06:32 crc kubenswrapper[4824]: I0224 00:06:32.776317 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 00:06:32 crc kubenswrapper[4824]: I0224 00:06:32.777963 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:06:32 crc kubenswrapper[4824]: I0224 00:06:32.778002 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:06:32 crc kubenswrapper[4824]: I0224 00:06:32.778016 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:06:32 crc kubenswrapper[4824]: I0224 00:06:32.778048 4824 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 24 00:06:32 crc kubenswrapper[4824]: E0224 00:06:32.781762 4824 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:32Z is after 2026-02-23T05:33:13Z" node="crc"
Feb 24 00:06:33 crc kubenswrapper[4824]: I0224 00:06:33.522643 4824 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 24 00:06:33 crc kubenswrapper[4824]: I0224 00:06:33.522796 4824 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 24 00:06:33 crc kubenswrapper[4824]: I0224 00:06:33.522904 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 24 00:06:33 crc kubenswrapper[4824]: I0224 00:06:33.523140 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 00:06:33 crc kubenswrapper[4824]: I0224 00:06:33.524903 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:06:33 crc kubenswrapper[4824]: I0224 00:06:33.524967 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:06:33 crc kubenswrapper[4824]: I0224 00:06:33.524992 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:06:33 crc kubenswrapper[4824]: I0224 00:06:33.525868 4824 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"8fe1cb1e114c1440630a33e3dd127af3600c520e37148cc01ab2e4ed346941ba"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted"
Feb 24 00:06:33 crc kubenswrapper[4824]: I0224 00:06:33.526098 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://8fe1cb1e114c1440630a33e3dd127af3600c520e37148cc01ab2e4ed346941ba" gracePeriod=30
Feb 24 00:06:33 crc kubenswrapper[4824]: I0224 00:06:33.626774 4824 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:33Z is after 2026-02-23T05:33:13Z
Feb 24 00:06:33 crc kubenswrapper[4824]: I0224 00:06:33.663030 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 19:37:49.088062821 +0000 UTC
Feb 24 00:06:33 crc kubenswrapper[4824]: I0224 00:06:33.957587 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log"
Feb 24 00:06:33 crc kubenswrapper[4824]: I0224 00:06:33.959113 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Feb 24 00:06:33 crc kubenswrapper[4824]: I0224 00:06:33.959887 4824 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="8fe1cb1e114c1440630a33e3dd127af3600c520e37148cc01ab2e4ed346941ba" exitCode=255
Feb 24 00:06:33 crc kubenswrapper[4824]: I0224 00:06:33.959950 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"8fe1cb1e114c1440630a33e3dd127af3600c520e37148cc01ab2e4ed346941ba"}
Feb 24 00:06:33 crc kubenswrapper[4824]: I0224 00:06:33.960002 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"706813f751a53e8e88d84668245bb4b94bc39d6b611f0fed9f774c226dc8c632"}
Feb 24 00:06:33 crc kubenswrapper[4824]: I0224 00:06:33.960035 4824 scope.go:117] "RemoveContainer" containerID="ef63a3a20052bbda09997002dbbce1fd4cdf577f00711857db86b460ed4e8165"
Feb 24 00:06:33 crc kubenswrapper[4824]: I0224 00:06:33.960288 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 00:06:33 crc kubenswrapper[4824]: I0224 00:06:33.962646 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:06:33 crc kubenswrapper[4824]: I0224 00:06:33.962699 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:06:33 crc kubenswrapper[4824]: I0224 00:06:33.962713 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:06:34 crc kubenswrapper[4824]: I0224 00:06:34.623733 4824 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:34Z is after 2026-02-23T05:33:13Z
Feb 24 00:06:34 crc kubenswrapper[4824]: I0224 00:06:34.663583 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 19:09:55.200913841 +0000 UTC
Feb 24 00:06:34 crc kubenswrapper[4824]: I0224 00:06:34.680861 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 24 00:06:34 crc kubenswrapper[4824]: I0224 00:06:34.967329 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log"
Feb 24 00:06:34 crc kubenswrapper[4824]: I0224 00:06:34.969406 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 00:06:34 crc kubenswrapper[4824]: I0224 00:06:34.970829 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:06:34 crc kubenswrapper[4824]: I0224 00:06:34.970904 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:06:34 crc kubenswrapper[4824]: I0224 00:06:34.970933 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:06:35 crc kubenswrapper[4824]: I0224 00:06:35.625158 4824 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:35Z is after 2026-02-23T05:33:13Z
Feb 24 00:06:35 crc kubenswrapper[4824]: I0224 00:06:35.664778 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 15:13:17.166117538 +0000 UTC
Feb 24 00:06:35 crc kubenswrapper[4824]: I0224 00:06:35.693306 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 00:06:35 crc kubenswrapper[4824]: I0224 00:06:35.695227 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:06:35 crc kubenswrapper[4824]: I0224 00:06:35.695277 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:06:35 crc kubenswrapper[4824]: I0224 00:06:35.695291 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:06:35 crc kubenswrapper[4824]: I0224 00:06:35.696092 4824 scope.go:117] "RemoveContainer" containerID="1f1c1f049b88250d60dc213ff2c7023d6f8204c87e8e78cff43d73af450d7e46"
Feb 24 00:06:35 crc kubenswrapper[4824]: I0224 00:06:35.975408 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Feb 24 00:06:35 crc kubenswrapper[4824]: I0224 00:06:35.977051 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a"}
Feb 24 00:06:35 crc kubenswrapper[4824]: I0224 00:06:35.977242 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 00:06:35 crc kubenswrapper[4824]: I0224 00:06:35.978154 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:06:35 crc kubenswrapper[4824]: I0224 00:06:35.978189 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:06:35 crc kubenswrapper[4824]: I0224 00:06:35.978198 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:06:36 crc kubenswrapper[4824]: I0224 00:06:36.623928 4824 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:36Z is after 2026-02-23T05:33:13Z
Feb 24 00:06:36 crc kubenswrapper[4824]: I0224 00:06:36.665588 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 20:55:11.83130633 +0000 UTC
Feb 24 00:06:36 crc kubenswrapper[4824]: E0224 00:06:36.753307 4824 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 24 00:06:36 crc kubenswrapper[4824]: I0224 00:06:36.984987 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Feb 24 00:06:36 crc kubenswrapper[4824]: I0224 00:06:36.985830 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Feb 24 00:06:36 crc kubenswrapper[4824]: I0224 00:06:36.988725 4824 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a" exitCode=255
Feb 24 00:06:36 crc kubenswrapper[4824]: I0224 00:06:36.988773 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a"}
Feb 24 00:06:36 crc kubenswrapper[4824]: I0224 00:06:36.988820 4824 scope.go:117] "RemoveContainer" containerID="1f1c1f049b88250d60dc213ff2c7023d6f8204c87e8e78cff43d73af450d7e46"
Feb 24 00:06:36 crc kubenswrapper[4824]: I0224 00:06:36.989097 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 00:06:36 crc kubenswrapper[4824]: I0224 00:06:36.990778 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:06:36 crc kubenswrapper[4824]: I0224 00:06:36.990871 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:06:36 crc kubenswrapper[4824]: I0224 00:06:36.990963 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:06:36 crc kubenswrapper[4824]: I0224 00:06:36.991638 4824 scope.go:117] "RemoveContainer" containerID="b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a"
Feb 24 00:06:36 crc kubenswrapper[4824]: E0224 00:06:36.991858 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 24 00:06:37 crc kubenswrapper[4824]: I0224 00:06:37.456865 4824 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Feb 24 00:06:37 crc kubenswrapper[4824]: I0224 00:06:37.571741 4824 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 24 00:06:37 crc kubenswrapper[4824]: I0224 00:06:37.667672 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 00:19:47.09999595 +0000 UTC
Feb 24 00:06:37 crc kubenswrapper[4824]: I0224 00:06:37.993686 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Feb 24 00:06:37 crc kubenswrapper[4824]: I0224 00:06:37.997372 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 00:06:37 crc kubenswrapper[4824]: I0224 00:06:37.999163 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:06:37 crc kubenswrapper[4824]: I0224 00:06:37.999247 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:06:37 crc kubenswrapper[4824]: I0224 00:06:37.999271 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:06:38 crc kubenswrapper[4824]: I0224 00:06:38.000482 4824 scope.go:117] "RemoveContainer" containerID="b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a"
Feb 24 00:06:38 crc kubenswrapper[4824]: E0224 00:06:38.001047 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 24 00:06:38 crc kubenswrapper[4824]: I0224 00:06:38.668207 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 07:49:56.39286261 +0000 UTC
Feb 24 00:06:39 crc kubenswrapper[4824]: I0224 00:06:39.668982 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 23:51:26.874709425 +0000 UTC
Feb 24 00:06:39 crc kubenswrapper[4824]: I0224 00:06:39.781855 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 00:06:39 crc kubenswrapper[4824]: I0224 00:06:39.783424 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:06:39 crc kubenswrapper[4824]: I0224 00:06:39.783858 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:06:39 crc kubenswrapper[4824]: I0224 00:06:39.784067 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:06:39 crc kubenswrapper[4824]: I0224 00:06:39.784383 4824 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 24 00:06:39 crc kubenswrapper[4824]: I0224 00:06:39.794088 4824 kubelet_node_status.go:115] "Node was previously registered" node="crc"
Feb 24 00:06:39 crc kubenswrapper[4824]: I0224 00:06:39.794559 4824 kubelet_node_status.go:79] "Successfully registered node" node="crc"
Feb 24 00:06:39 crc kubenswrapper[4824]: E0224 00:06:39.794591 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Feb 24 00:06:39 crc kubenswrapper[4824]: I0224 00:06:39.800212 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:06:39 crc kubenswrapper[4824]: I0224 00:06:39.800573 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:06:39 crc kubenswrapper[4824]: I0224 00:06:39.800818 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:06:39 crc kubenswrapper[4824]: I0224 00:06:39.801009 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 00:06:39 crc kubenswrapper[4824]: I0224 00:06:39.801177 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:06:39Z","lastTransitionTime":"2026-02-24T00:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 00:06:39 crc kubenswrapper[4824]: E0224 00:06:39.820329 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7ea41d01-04ab-44da-af10-993e94777268\\\",\\\"systemUUID\\\":\\\"d5e3d68d-d538-4dbe-b3fe-7347ab36b29a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 00:06:39 crc kubenswrapper[4824]: I0224 00:06:39.833263 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:06:39 crc kubenswrapper[4824]: I0224 00:06:39.833697 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:06:39 crc kubenswrapper[4824]: I0224 00:06:39.833919 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:06:39 crc kubenswrapper[4824]: I0224 00:06:39.834122 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:06:39 crc kubenswrapper[4824]: I0224 00:06:39.834312 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:06:39Z","lastTransitionTime":"2026-02-24T00:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:06:39 crc kubenswrapper[4824]: E0224 00:06:39.850832 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7ea41d01-04ab-44da-af10-993e94777268\\\",\\\"systemUUID\\\":\\\"d5e3d68d-d538-4dbe-b3fe-7347ab36b29a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 00:06:39 crc kubenswrapper[4824]: I0224 00:06:39.859870 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:06:39 crc kubenswrapper[4824]: I0224 00:06:39.859913 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:06:39 crc kubenswrapper[4824]: I0224 00:06:39.859927 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:06:39 crc kubenswrapper[4824]: I0224 00:06:39.859952 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:06:39 crc kubenswrapper[4824]: I0224 00:06:39.859968 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:06:39Z","lastTransitionTime":"2026-02-24T00:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:06:39 crc kubenswrapper[4824]: E0224 00:06:39.873546 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7ea41d01-04ab-44da-af10-993e94777268\\\",\\\"systemUUID\\\":\\\"d5e3d68d-d538-4dbe-b3fe-7347ab36b29a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 00:06:39 crc kubenswrapper[4824]: I0224 00:06:39.886178 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:06:39 crc kubenswrapper[4824]: I0224 00:06:39.886209 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:06:39 crc kubenswrapper[4824]: I0224 00:06:39.886225 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:06:39 crc kubenswrapper[4824]: I0224 00:06:39.886244 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:06:39 crc kubenswrapper[4824]: I0224 00:06:39.886261 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:06:39Z","lastTransitionTime":"2026-02-24T00:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:06:39 crc kubenswrapper[4824]: E0224 00:06:39.899000 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:06:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7ea41d01-04ab-44da-af10-993e94777268\\\",\\\"systemUUID\\\":\\\"d5e3d68d-d538-4dbe-b3fe-7347ab36b29a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 00:06:39 crc kubenswrapper[4824]: E0224 00:06:39.899628 4824 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 24 00:06:39 crc kubenswrapper[4824]: E0224 00:06:39.899836 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 00:06:40 crc kubenswrapper[4824]: E0224 00:06:40.000058 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 00:06:40 crc kubenswrapper[4824]: E0224 00:06:40.101110 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 00:06:40 crc kubenswrapper[4824]: E0224 00:06:40.202617 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 00:06:40 crc kubenswrapper[4824]: E0224 00:06:40.303461 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 00:06:40 crc kubenswrapper[4824]: I0224 00:06:40.357650 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 00:06:40 crc kubenswrapper[4824]: I0224 00:06:40.358145 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 00:06:40 crc kubenswrapper[4824]: I0224 00:06:40.359770 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:06:40 crc kubenswrapper[4824]: I0224 00:06:40.359852 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:06:40 crc kubenswrapper[4824]: I0224 
00:06:40.359872 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:06:40 crc kubenswrapper[4824]: I0224 00:06:40.361250 4824 scope.go:117] "RemoveContainer" containerID="b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a" Feb 24 00:06:40 crc kubenswrapper[4824]: E0224 00:06:40.361632 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 24 00:06:40 crc kubenswrapper[4824]: E0224 00:06:40.404309 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 00:06:40 crc kubenswrapper[4824]: E0224 00:06:40.504790 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 00:06:40 crc kubenswrapper[4824]: I0224 00:06:40.521299 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 00:06:40 crc kubenswrapper[4824]: I0224 00:06:40.521592 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 00:06:40 crc kubenswrapper[4824]: I0224 00:06:40.523395 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:06:40 crc kubenswrapper[4824]: I0224 00:06:40.523464 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:06:40 crc kubenswrapper[4824]: I0224 00:06:40.523484 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" 
Feb 24 00:06:40 crc kubenswrapper[4824]: E0224 00:06:40.605077 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 00:06:40 crc kubenswrapper[4824]: I0224 00:06:40.669440 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 23:31:20.358464057 +0000 UTC Feb 24 00:06:40 crc kubenswrapper[4824]: E0224 00:06:40.705445 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 00:06:40 crc kubenswrapper[4824]: E0224 00:06:40.805901 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 00:06:40 crc kubenswrapper[4824]: E0224 00:06:40.906317 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 00:06:41 crc kubenswrapper[4824]: E0224 00:06:41.006618 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 00:06:41 crc kubenswrapper[4824]: E0224 00:06:41.107970 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 00:06:41 crc kubenswrapper[4824]: E0224 00:06:41.208092 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 00:06:41 crc kubenswrapper[4824]: E0224 00:06:41.309261 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 00:06:41 crc kubenswrapper[4824]: E0224 00:06:41.410271 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 00:06:41 crc kubenswrapper[4824]: E0224 00:06:41.510724 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 00:06:41 crc kubenswrapper[4824]: E0224 
00:06:41.611590 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 00:06:41 crc kubenswrapper[4824]: I0224 00:06:41.674686 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 07:44:32.202341001 +0000 UTC Feb 24 00:06:41 crc kubenswrapper[4824]: I0224 00:06:41.692959 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 00:06:41 crc kubenswrapper[4824]: I0224 00:06:41.694471 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:06:41 crc kubenswrapper[4824]: I0224 00:06:41.694538 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:06:41 crc kubenswrapper[4824]: I0224 00:06:41.694550 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:06:41 crc kubenswrapper[4824]: E0224 00:06:41.711951 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 00:06:41 crc kubenswrapper[4824]: E0224 00:06:41.812811 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 00:06:41 crc kubenswrapper[4824]: E0224 00:06:41.913429 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 00:06:42 crc kubenswrapper[4824]: E0224 00:06:42.013547 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 00:06:42 crc kubenswrapper[4824]: E0224 00:06:42.113970 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 00:06:42 crc kubenswrapper[4824]: E0224 00:06:42.214660 4824 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Feb 24 00:06:42 crc kubenswrapper[4824]: E0224 00:06:42.315498 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 00:06:42 crc kubenswrapper[4824]: E0224 00:06:42.416046 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 00:06:42 crc kubenswrapper[4824]: E0224 00:06:42.516444 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 00:06:42 crc kubenswrapper[4824]: E0224 00:06:42.616893 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 00:06:42 crc kubenswrapper[4824]: I0224 00:06:42.675891 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 13:23:50.034027917 +0000 UTC Feb 24 00:06:42 crc kubenswrapper[4824]: E0224 00:06:42.717774 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 00:06:42 crc kubenswrapper[4824]: E0224 00:06:42.818667 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 00:06:42 crc kubenswrapper[4824]: I0224 00:06:42.859125 4824 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 24 00:06:42 crc kubenswrapper[4824]: E0224 00:06:42.918859 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 00:06:43 crc kubenswrapper[4824]: E0224 00:06:43.019260 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 00:06:43 crc kubenswrapper[4824]: E0224 00:06:43.119567 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node 
\"crc\" not found"
Feb 24 00:06:43 crc kubenswrapper[4824]: E0224 00:06:43.220267 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:43 crc kubenswrapper[4824]: E0224 00:06:43.320462 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:43 crc kubenswrapper[4824]: E0224 00:06:43.421323 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:43 crc kubenswrapper[4824]: E0224 00:06:43.522251 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:43 crc kubenswrapper[4824]: I0224 00:06:43.522292 4824 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 24 00:06:43 crc kubenswrapper[4824]: I0224 00:06:43.522354 4824 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 24 00:06:43 crc kubenswrapper[4824]: E0224 00:06:43.622353 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:43 crc kubenswrapper[4824]: I0224 00:06:43.676658 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 09:27:37.873862517 +0000 UTC
Feb 24 00:06:43 crc kubenswrapper[4824]: E0224 00:06:43.722669 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:43 crc kubenswrapper[4824]: E0224 00:06:43.822918 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:43 crc kubenswrapper[4824]: E0224 00:06:43.923736 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:44 crc kubenswrapper[4824]: E0224 00:06:44.024141 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:44 crc kubenswrapper[4824]: E0224 00:06:44.124860 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:44 crc kubenswrapper[4824]: E0224 00:06:44.225207 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:44 crc kubenswrapper[4824]: E0224 00:06:44.325440 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:44 crc kubenswrapper[4824]: E0224 00:06:44.426458 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:44 crc kubenswrapper[4824]: E0224 00:06:44.527191 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:44 crc kubenswrapper[4824]: E0224 00:06:44.627828 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:44 crc kubenswrapper[4824]: I0224 00:06:44.676825 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 17:35:28.969885904 +0000 UTC
Feb 24 00:06:44 crc kubenswrapper[4824]: E0224 00:06:44.728191 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:44 crc kubenswrapper[4824]: E0224 00:06:44.828733 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:44 crc kubenswrapper[4824]: E0224 00:06:44.929625 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:45 crc kubenswrapper[4824]: E0224 00:06:45.029811 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:45 crc kubenswrapper[4824]: E0224 00:06:45.130728 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:45 crc kubenswrapper[4824]: E0224 00:06:45.231589 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:45 crc kubenswrapper[4824]: E0224 00:06:45.332660 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:45 crc kubenswrapper[4824]: E0224 00:06:45.433666 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:45 crc kubenswrapper[4824]: E0224 00:06:45.534768 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:45 crc kubenswrapper[4824]: E0224 00:06:45.634898 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:45 crc kubenswrapper[4824]: I0224 00:06:45.677410 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 14:57:14.559341208 +0000 UTC
Feb 24 00:06:45 crc kubenswrapper[4824]: E0224 00:06:45.735217 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:45 crc kubenswrapper[4824]: E0224 00:06:45.835887 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:45 crc kubenswrapper[4824]: E0224 00:06:45.936803 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:46 crc kubenswrapper[4824]: E0224 00:06:46.037108 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:46 crc kubenswrapper[4824]: E0224 00:06:46.138021 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:46 crc kubenswrapper[4824]: E0224 00:06:46.238336 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:46 crc kubenswrapper[4824]: E0224 00:06:46.339986 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:46 crc kubenswrapper[4824]: E0224 00:06:46.440736 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:46 crc kubenswrapper[4824]: E0224 00:06:46.541665 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:46 crc kubenswrapper[4824]: E0224 00:06:46.642885 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:46 crc kubenswrapper[4824]: I0224 00:06:46.678257 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 06:42:05.214927626 +0000 UTC
Feb 24 00:06:46 crc kubenswrapper[4824]: E0224 00:06:46.744122 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:46 crc kubenswrapper[4824]: E0224 00:06:46.754274 4824 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 24 00:06:46 crc kubenswrapper[4824]: E0224 00:06:46.845004 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:46 crc kubenswrapper[4824]: E0224 00:06:46.945794 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:47 crc kubenswrapper[4824]: E0224 00:06:47.046448 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:47 crc kubenswrapper[4824]: E0224 00:06:47.147141 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:47 crc kubenswrapper[4824]: E0224 00:06:47.247789 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:47 crc kubenswrapper[4824]: E0224 00:06:47.348117 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:47 crc kubenswrapper[4824]: E0224 00:06:47.448822 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:47 crc kubenswrapper[4824]: E0224 00:06:47.550037 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:47 crc kubenswrapper[4824]: E0224 00:06:47.650800 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:47 crc kubenswrapper[4824]: I0224 00:06:47.679207 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 03:30:39.517843055 +0000 UTC
Feb 24 00:06:47 crc kubenswrapper[4824]: E0224 00:06:47.751057 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:47 crc kubenswrapper[4824]: E0224 00:06:47.852010 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:47 crc kubenswrapper[4824]: E0224 00:06:47.952711 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:48 crc kubenswrapper[4824]: E0224 00:06:48.054003 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:48 crc kubenswrapper[4824]: E0224 00:06:48.154965 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:48 crc kubenswrapper[4824]: E0224 00:06:48.255765 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:48 crc kubenswrapper[4824]: E0224 00:06:48.356658 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:48 crc kubenswrapper[4824]: E0224 00:06:48.457677 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:48 crc kubenswrapper[4824]: E0224 00:06:48.558603 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:48 crc kubenswrapper[4824]: E0224 00:06:48.659447 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:48 crc kubenswrapper[4824]: I0224 00:06:48.679827 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 13:28:43.858003213 +0000 UTC
Feb 24 00:06:48 crc kubenswrapper[4824]: E0224 00:06:48.760535 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:48 crc kubenswrapper[4824]: E0224 00:06:48.861485 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:48 crc kubenswrapper[4824]: I0224 00:06:48.937453 4824 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Feb 24 00:06:48 crc kubenswrapper[4824]: I0224 00:06:48.950036 4824 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Feb 24 00:06:48 crc kubenswrapper[4824]: E0224 00:06:48.962385 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:49 crc kubenswrapper[4824]: E0224 00:06:49.063113 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:49 crc kubenswrapper[4824]: E0224 00:06:49.164009 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:49 crc kubenswrapper[4824]: E0224 00:06:49.264229 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:49 crc kubenswrapper[4824]: E0224 00:06:49.365343 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:49 crc kubenswrapper[4824]: E0224 00:06:49.466334 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:49 crc kubenswrapper[4824]: E0224 00:06:49.566500 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:49 crc kubenswrapper[4824]: E0224 00:06:49.667324 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:49 crc kubenswrapper[4824]: I0224 00:06:49.673856 4824 csr.go:261] certificate signing request csr-s9jhh is approved, waiting to be issued
Feb 24 00:06:49 crc kubenswrapper[4824]: I0224 00:06:49.681974 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 09:04:51.106511388 +0000 UTC
Feb 24 00:06:49 crc kubenswrapper[4824]: I0224 00:06:49.716343 4824 csr.go:257] certificate signing request csr-s9jhh is issued
Feb 24 00:06:49 crc kubenswrapper[4824]: E0224 00:06:49.769748 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:49 crc kubenswrapper[4824]: E0224 00:06:49.870913 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:49 crc kubenswrapper[4824]: E0224 00:06:49.965298 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Feb 24 00:06:49 crc kubenswrapper[4824]: I0224 00:06:49.969392 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:06:49 crc kubenswrapper[4824]: I0224 00:06:49.969433 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:06:49 crc kubenswrapper[4824]: I0224 00:06:49.969447 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:06:49 crc kubenswrapper[4824]: I0224 00:06:49.969482 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 00:06:49 crc kubenswrapper[4824]: I0224 00:06:49.969494 4824 setters.go:603] "Node became not ready" node="crc"
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:06:49Z","lastTransitionTime":"2026-02-24T00:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:06:49 crc kubenswrapper[4824]: E0224 00:06:49.979498 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:06:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:06:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:06:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:06:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7ea41d01-04ab-44da-af10-993e94777268\\\",\\\"systemUUID\\\":\\\"d5e3d68d-d538-4dbe-b3fe-7347ab36b29a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 00:06:49 crc kubenswrapper[4824]: I0224 00:06:49.987750 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:06:49 crc kubenswrapper[4824]: I0224 00:06:49.987798 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:06:49 crc kubenswrapper[4824]: I0224 00:06:49.987810 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:06:49 crc kubenswrapper[4824]: I0224 00:06:49.987830 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:06:49 crc kubenswrapper[4824]: I0224 00:06:49.987842 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:06:49Z","lastTransitionTime":"2026-02-24T00:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:06:49 crc kubenswrapper[4824]: E0224 00:06:49.998318 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:06:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:06:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:06:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:06:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7ea41d01-04ab-44da-af10-993e94777268\\\",\\\"systemUUID\\\":\\\"d5e3d68d-d538-4dbe-b3fe-7347ab36b29a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 00:06:50 crc kubenswrapper[4824]: I0224 00:06:50.001742 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:06:50 crc kubenswrapper[4824]: I0224 00:06:50.001788 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:06:50 crc kubenswrapper[4824]: I0224 00:06:50.001801 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:06:50 crc kubenswrapper[4824]: I0224 00:06:50.001820 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:06:50 crc kubenswrapper[4824]: I0224 00:06:50.001841 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:06:50Z","lastTransitionTime":"2026-02-24T00:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:06:50 crc kubenswrapper[4824]: E0224 00:06:50.011801 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7ea41d01-04ab-44da-af10-993e94777268\\\",\\\"systemUUID\\\":\\\"d5e3d68d-d538-4dbe-b3fe-7347ab36b29a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 00:06:50 crc kubenswrapper[4824]: I0224 00:06:50.017377 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:06:50 crc kubenswrapper[4824]: I0224 00:06:50.017415 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:06:50 crc kubenswrapper[4824]: I0224 00:06:50.017428 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:06:50 crc kubenswrapper[4824]: I0224 00:06:50.017449 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:06:50 crc kubenswrapper[4824]: I0224 00:06:50.017461 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:06:50Z","lastTransitionTime":"2026-02-24T00:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:06:50 crc kubenswrapper[4824]: E0224 00:06:50.041336 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:06:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7ea41d01-04ab-44da-af10-993e94777268\\\",\\\"systemUUID\\\":\\\"d5e3d68d-d538-4dbe-b3fe-7347ab36b29a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 24 00:06:50 crc kubenswrapper[4824]: E0224 00:06:50.041473 4824 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Feb 24 00:06:50 crc kubenswrapper[4824]: E0224 00:06:50.041500 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:50 crc kubenswrapper[4824]: E0224 00:06:50.142734 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:50 crc kubenswrapper[4824]: E0224 00:06:50.243305 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:50 crc kubenswrapper[4824]: E0224 00:06:50.343989 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:50 crc kubenswrapper[4824]: E0224 00:06:50.444884 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:50 crc kubenswrapper[4824]: I0224 00:06:50.537600 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 24 00:06:50 crc kubenswrapper[4824]: I0224 00:06:50.537803 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 00:06:50 crc kubenswrapper[4824]: I0224 00:06:50.539055 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:06:50 crc kubenswrapper[4824]: I0224 00:06:50.539089 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:06:50 crc kubenswrapper[4824]: I0224 00:06:50.539100 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:06:50 crc kubenswrapper[4824]: I0224 00:06:50.544733 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 24 00:06:50 crc kubenswrapper[4824]: E0224 00:06:50.545872 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:50 crc kubenswrapper[4824]: E0224 00:06:50.646318 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:50 crc kubenswrapper[4824]: I0224 00:06:50.682755 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 00:50:38.041379468 +0000 UTC
Feb 24 00:06:50 crc kubenswrapper[4824]: I0224 00:06:50.718152 4824 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 00:01:49 +0000 UTC, rotation deadline is 2026-11-10 02:17:58.057624157 +0000 UTC
Feb 24 00:06:50 crc kubenswrapper[4824]: I0224 00:06:50.718212 4824 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6218h11m7.339415298s for next certificate rotation
Feb 24 00:06:50 crc kubenswrapper[4824]: E0224 00:06:50.746703 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:50 crc kubenswrapper[4824]: E0224 00:06:50.847584 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:50 crc kubenswrapper[4824]: E0224 00:06:50.948124 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:51 crc kubenswrapper[4824]: I0224 00:06:51.035556 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 00:06:51 crc kubenswrapper[4824]: I0224 00:06:51.036681 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:06:51 crc kubenswrapper[4824]: I0224 00:06:51.036723 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:06:51 crc kubenswrapper[4824]: I0224 00:06:51.036734 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:06:51 crc kubenswrapper[4824]: E0224 00:06:51.048422 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:51 crc kubenswrapper[4824]: E0224 00:06:51.149512 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:51 crc kubenswrapper[4824]: E0224 00:06:51.252672 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:51 crc kubenswrapper[4824]: E0224 00:06:51.353358 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:51 crc kubenswrapper[4824]: E0224 00:06:51.454320 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:51 crc kubenswrapper[4824]: E0224 00:06:51.554670 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:51 crc kubenswrapper[4824]: E0224 00:06:51.654916 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:51 crc kubenswrapper[4824]: I0224 00:06:51.683316 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 06:14:38.933680681 +0000 UTC
Feb 24 00:06:51 crc kubenswrapper[4824]: E0224 00:06:51.755104 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:51 crc kubenswrapper[4824]: E0224 00:06:51.855978 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:51 crc kubenswrapper[4824]: E0224 00:06:51.956828 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:52 crc kubenswrapper[4824]: E0224 00:06:52.057676 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:52 crc kubenswrapper[4824]: E0224 00:06:52.158775 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:52 crc kubenswrapper[4824]: E0224 00:06:52.259410 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:52 crc kubenswrapper[4824]: E0224 00:06:52.360299 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:52 crc kubenswrapper[4824]: E0224 00:06:52.461240 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:52 crc kubenswrapper[4824]: E0224 00:06:52.562333 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:52 crc kubenswrapper[4824]: E0224 00:06:52.663369 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:52 crc kubenswrapper[4824]: I0224 00:06:52.684043 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 04:09:35.253055869 +0000 UTC
Feb 24 00:06:52 crc kubenswrapper[4824]: E0224 00:06:52.763914 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:52 crc kubenswrapper[4824]: E0224 00:06:52.864669 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:52 crc kubenswrapper[4824]: E0224 00:06:52.965309 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:53 crc kubenswrapper[4824]: E0224 00:06:53.065972 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:53 crc kubenswrapper[4824]: E0224 00:06:53.166702 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:53 crc kubenswrapper[4824]: E0224 00:06:53.267755 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:53 crc kubenswrapper[4824]: E0224 00:06:53.368059 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:53 crc kubenswrapper[4824]: E0224 00:06:53.468585 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:53 crc kubenswrapper[4824]: E0224 00:06:53.569689 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:53 crc kubenswrapper[4824]: E0224 00:06:53.670537 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:53 crc kubenswrapper[4824]: I0224 00:06:53.684802 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 04:09:00.561762235 +0000 UTC
Feb 24 00:06:53 crc kubenswrapper[4824]: E0224 00:06:53.771563 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:53 crc kubenswrapper[4824]: E0224 00:06:53.872564 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:53 crc kubenswrapper[4824]: E0224 00:06:53.973229 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:54 crc kubenswrapper[4824]: E0224 00:06:54.074274 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:54 crc kubenswrapper[4824]: E0224 00:06:54.174716 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:54 crc kubenswrapper[4824]: E0224 00:06:54.274984 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:54 crc kubenswrapper[4824]: E0224 00:06:54.375690 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:54 crc kubenswrapper[4824]: E0224 00:06:54.476366 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:54 crc kubenswrapper[4824]: E0224 00:06:54.577393 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:54 crc kubenswrapper[4824]: E0224 00:06:54.678506 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:54 crc kubenswrapper[4824]: I0224 00:06:54.685431 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 10:03:38.360293656 +0000 UTC
Feb 24 00:06:54 crc kubenswrapper[4824]: E0224 00:06:54.778628 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:54 crc kubenswrapper[4824]: E0224 00:06:54.879208 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:54 crc kubenswrapper[4824]: E0224 00:06:54.980216 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:55 crc kubenswrapper[4824]: E0224 00:06:55.080976 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:55 crc kubenswrapper[4824]: E0224 00:06:55.182129 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:55 crc kubenswrapper[4824]: E0224 00:06:55.282621 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:55 crc kubenswrapper[4824]: E0224 00:06:55.383321 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:55 crc kubenswrapper[4824]: E0224 00:06:55.484220 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:55 crc kubenswrapper[4824]: E0224 00:06:55.584742 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:55 crc kubenswrapper[4824]: E0224 00:06:55.685326 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:55 crc kubenswrapper[4824]: I0224 00:06:55.686453 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 20:57:31.036983447 +0000 UTC
Feb 24 00:06:55 crc kubenswrapper[4824]: I0224 00:06:55.693820 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 00:06:55 crc kubenswrapper[4824]: I0224 00:06:55.695078 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:06:55 crc kubenswrapper[4824]: I0224 00:06:55.695126 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:06:55 crc kubenswrapper[4824]: I0224 00:06:55.695139 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:06:55 crc kubenswrapper[4824]: I0224 00:06:55.695828 4824 scope.go:117] "RemoveContainer" containerID="b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a"
Feb 24 00:06:55 crc kubenswrapper[4824]: E0224 00:06:55.696013 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 24 00:06:55 crc kubenswrapper[4824]: E0224 00:06:55.786144 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:55 crc kubenswrapper[4824]: E0224 00:06:55.886643 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:55 crc kubenswrapper[4824]: E0224 00:06:55.987677 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:56 crc kubenswrapper[4824]: E0224 00:06:56.088119 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:56 crc kubenswrapper[4824]: E0224 00:06:56.188956 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:56 crc kubenswrapper[4824]: E0224 00:06:56.289560 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:56 crc kubenswrapper[4824]: E0224 00:06:56.390417 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:56 crc kubenswrapper[4824]: I0224 00:06:56.472090 4824 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Feb 24 00:06:56 crc kubenswrapper[4824]: E0224 00:06:56.491065 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:56 crc kubenswrapper[4824]: E0224 00:06:56.591609 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:56 crc kubenswrapper[4824]: I0224 00:06:56.687212 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 17:30:25.447875698 +0000 UTC
Feb 24 00:06:56 crc kubenswrapper[4824]: E0224 00:06:56.692370 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:56 crc kubenswrapper[4824]: I0224 00:06:56.693874 4824 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 00:06:56 crc kubenswrapper[4824]: I0224 00:06:56.695233 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:06:56 crc kubenswrapper[4824]: I0224 00:06:56.695272 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:06:56 crc kubenswrapper[4824]: I0224 00:06:56.695283 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:06:56 crc kubenswrapper[4824]: E0224 00:06:56.755302 4824 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 24 00:06:56 crc kubenswrapper[4824]: E0224 00:06:56.793478 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:56 crc kubenswrapper[4824]: E0224 00:06:56.894140 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:56 crc kubenswrapper[4824]: E0224 00:06:56.995202 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:57 crc kubenswrapper[4824]: E0224 00:06:57.095638 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:57 crc kubenswrapper[4824]: E0224 00:06:57.196660 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:57 crc kubenswrapper[4824]: E0224 00:06:57.297795 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:57 crc kubenswrapper[4824]: E0224 00:06:57.398384 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:57 crc kubenswrapper[4824]: E0224 00:06:57.499416 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:57 crc kubenswrapper[4824]: E0224 00:06:57.600579 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:57 crc kubenswrapper[4824]: I0224 00:06:57.688018 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 10:39:31.220267436 +0000 UTC
Feb 24 00:06:57 crc kubenswrapper[4824]: E0224 00:06:57.701460 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:57 crc kubenswrapper[4824]: E0224 00:06:57.802282 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:57 crc kubenswrapper[4824]: E0224 00:06:57.903078 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:58 crc kubenswrapper[4824]: E0224 00:06:58.003236 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:58 crc kubenswrapper[4824]: E0224 00:06:58.103506 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:58 crc kubenswrapper[4824]: E0224 00:06:58.204575 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:58 crc kubenswrapper[4824]: E0224 00:06:58.305331 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:58 crc kubenswrapper[4824]: E0224 00:06:58.406057 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:58 crc kubenswrapper[4824]: E0224 00:06:58.506716 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:58 crc kubenswrapper[4824]: E0224 00:06:58.607807 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:58 crc kubenswrapper[4824]: I0224 00:06:58.688904 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 12:24:50.72989597 +0000 UTC
Feb 24 00:06:58 crc kubenswrapper[4824]: E0224 00:06:58.708544 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:58 crc kubenswrapper[4824]: E0224 00:06:58.809623 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:58 crc kubenswrapper[4824]: E0224 00:06:58.910128 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:59 crc kubenswrapper[4824]: E0224 00:06:59.010261 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:59 crc kubenswrapper[4824]: E0224 00:06:59.110922 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:59 crc kubenswrapper[4824]: E0224 00:06:59.211746 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:59 crc kubenswrapper[4824]: E0224 00:06:59.312115 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:59 crc kubenswrapper[4824]: E0224 00:06:59.412784 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:59 crc kubenswrapper[4824]: E0224 00:06:59.513453 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:59 crc kubenswrapper[4824]: E0224 00:06:59.614469 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:59 crc kubenswrapper[4824]: I0224 00:06:59.689935 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 05:37:17.236355843 +0000 UTC
Feb 24 00:06:59 crc kubenswrapper[4824]: E0224 00:06:59.714990 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:59 crc kubenswrapper[4824]: E0224 00:06:59.815966 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:06:59 crc kubenswrapper[4824]: E0224 00:06:59.916959 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:07:00 crc kubenswrapper[4824]: E0224 00:07:00.017594 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:07:00 crc kubenswrapper[4824]: E0224 00:07:00.118556 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:07:00 crc kubenswrapper[4824]: E0224 00:07:00.219296 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 00:07:00 crc kubenswrapper[4824]: E0224 00:07:00.255731 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.260757 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.260828 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.260851 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.260878 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.260899 4824 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:00Z","lastTransitionTime":"2026-02-24T00:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:00 crc kubenswrapper[4824]: E0224 00:07:00.274766 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7ea41d01-04ab-44da-af10-993e94777268\\\",\\\"systemUUID\\\":\\\"d5e3d68d-d538-4dbe-b3fe-7347ab36b29a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.279643 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.279720 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.279750 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.279778 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.279797 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:00Z","lastTransitionTime":"2026-02-24T00:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:00 crc kubenswrapper[4824]: E0224 00:07:00.299596 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7ea41d01-04ab-44da-af10-993e94777268\\\",\\\"systemUUID\\\":\\\"d5e3d68d-d538-4dbe-b3fe-7347ab36b29a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.304796 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.304834 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.304845 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.305104 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.305136 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:00Z","lastTransitionTime":"2026-02-24T00:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:00 crc kubenswrapper[4824]: E0224 00:07:00.320243 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7ea41d01-04ab-44da-af10-993e94777268\\\",\\\"systemUUID\\\":\\\"d5e3d68d-d538-4dbe-b3fe-7347ab36b29a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.325325 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.325378 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.325398 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.325423 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.325440 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:00Z","lastTransitionTime":"2026-02-24T00:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:00 crc kubenswrapper[4824]: E0224 00:07:00.339888 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7ea41d01-04ab-44da-af10-993e94777268\\\",\\\"systemUUID\\\":\\\"d5e3d68d-d538-4dbe-b3fe-7347ab36b29a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 00:07:00 crc kubenswrapper[4824]: E0224 00:07:00.340030 4824 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 24 00:07:00 crc kubenswrapper[4824]: E0224 00:07:00.340056 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 00:07:00 crc kubenswrapper[4824]: E0224 00:07:00.440783 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 00:07:00 crc kubenswrapper[4824]: E0224 00:07:00.541490 4824 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.638754 4824 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.644064 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.644148 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.644162 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.644181 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.644195 4824 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:00Z","lastTransitionTime":"2026-02-24T00:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.651371 4824 apiserver.go:52] "Watching apiserver" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.657950 4824 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.658203 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf"] Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.658574 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.658732 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.658754 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.658784 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.658932 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 24 00:07:00 crc kubenswrapper[4824]: E0224 00:07:00.658884 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.659296 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:07:00 crc kubenswrapper[4824]: E0224 00:07:00.659450 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 00:07:00 crc kubenswrapper[4824]: E0224 00:07:00.659569 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.661267 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.661381 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.661577 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.661973 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.662071 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.662331 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.662399 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.662580 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.663510 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.690112 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 
2025-12-27 08:04:39.151490291 +0000 UTC Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.692763 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.708732 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.723187 4824 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.723313 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.739662 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.747414 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.747487 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.747506 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.747578 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.747609 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:00Z","lastTransitionTime":"2026-02-24T00:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.752725 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.764340 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.774244 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.808126 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.808200 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.808240 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod 
\"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.808877 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.808915 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.808668 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.808948 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.808743 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.808981 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.809049 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.809077 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 
00:07:00.809094 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.809114 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.809668 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.809242 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.809677 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.809545 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.809739 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.809551 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.809730 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.809807 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.809664 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.809831 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.809860 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.809881 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.809901 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.809921 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.809941 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.809988 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.810007 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.810031 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.810050 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.810068 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod 
\"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.810092 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.810114 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.810133 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.810151 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.810170 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.810190 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.810210 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.810228 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.810250 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.810267 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.810261 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). 
InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.810376 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.810284 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.810467 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.810546 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.810596 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 
00:07:00.810631 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.810667 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.810702 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.810738 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.810778 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.810814 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.810847 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.810877 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.810909 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.810940 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.810973 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 24 00:07:00 
crc kubenswrapper[4824]: I0224 00:07:00.811011 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.811048 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.811085 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.811127 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.811166 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.811203 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.811240 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.811271 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.811297 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.811329 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.810351 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). 
InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.811367 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.811410 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.811445 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.811479 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.811513 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.811568 4824 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.811600 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.811631 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.811665 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.811697 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.811730 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.811764 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.811794 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.811825 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.811858 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.811973 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.812009 4824 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.812041 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.812070 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.812130 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.812163 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.812203 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: 
\"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.812235 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.812267 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.812304 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.812339 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.812370 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.812403 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.812433 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.812464 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.812494 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.812554 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.812588 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 24 00:07:00 crc 
kubenswrapper[4824]: I0224 00:07:00.812626 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.812655 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.812689 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.812720 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.812751 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.812782 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.812813 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.812847 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.812890 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.812926 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.812958 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: 
\"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.812993 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.813026 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.813059 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.813092 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.813128 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.813170 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: 
\"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.813203 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.813232 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.813265 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.813308 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.813338 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 24 00:07:00 crc 
kubenswrapper[4824]: I0224 00:07:00.813367 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.813401 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.813435 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.813466 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.813497 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.813552 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.813587 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.813614 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.813642 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.813671 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.813703 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 
00:07:00.813733 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.813762 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.813795 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.813831 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.813867 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.813957 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" 
(UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.814020 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.814058 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.814091 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.814126 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.814156 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.814194 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.814233 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.814268 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.814304 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.814342 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.814380 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: 
\"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.814413 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.814450 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.814489 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.814564 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.814609 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.814645 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.814680 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.814715 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.814750 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.814788 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.814825 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: 
\"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.814859 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.814892 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.814928 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.814961 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.814998 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.815033 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.815067 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.815098 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.815127 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.815156 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.815308 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:07:00 
crc kubenswrapper[4824]: I0224 00:07:00.815351 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.815385 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.815420 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.815454 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.815493 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.815556 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.815588 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.815617 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.815651 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.815690 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.815727 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 24 00:07:00 crc 
kubenswrapper[4824]: I0224 00:07:00.815760 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.815793 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.815825 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.815860 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.815894 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.815937 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.815974 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.816009 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.816042 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.816079 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.816114 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.816152 4824 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.816194 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.816234 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.816265 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.816307 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.816343 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.816376 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.816405 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.816487 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.816559 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.816604 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: 
\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.816646 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.816694 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.816733 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.816781 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.816816 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.816860 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.816904 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.816943 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.816980 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.817017 4824 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.817056 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.810274 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.810633 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.810705 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.811004 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.811096 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.811127 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.811337 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.811417 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.811510 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.811509 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.811636 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.811708 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.811935 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.812176 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.812188 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.813137 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.813254 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.813853 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.814293 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.814344 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.814646 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.814768 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.814800 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.814940 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.814959 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.815001 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.815479 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.815606 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.815742 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.815916 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.816107 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.816113 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.816125 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.816322 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.816676 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.816765 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.817055 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.817569 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.817634 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.817757 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.818066 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.818089 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.818145 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.818209 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.818257 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.818942 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.818975 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.819034 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.819444 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.819571 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.819590 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.819627 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.819879 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.819904 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.820190 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.820637 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.820780 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.820883 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.820952 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.820914 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.821213 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.821324 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.821739 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.821890 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.822336 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: E0224 00:07:00.822553 4824 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 00:07:00 crc kubenswrapper[4824]: E0224 00:07:00.822666 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-24 00:07:01.322630049 +0000 UTC m=+85.312254538 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.822984 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.823003 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.823121 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.823210 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.823568 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.823622 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.823673 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.823816 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.824143 4824 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.824850 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.825374 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.825430 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). 
InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.825629 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: E0224 00:07:00.825737 4824 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 00:07:00 crc kubenswrapper[4824]: E0224 00:07:00.825856 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 00:07:01.325834651 +0000 UTC m=+85.315459130 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.825936 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.826163 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.826636 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.828362 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.828413 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.828763 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.828987 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.829288 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.829697 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.830202 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.830298 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: E0224 00:07:00.830401 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:07:01.330370648 +0000 UTC m=+85.319995117 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.830592 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.830656 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.830726 4824 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.830835 4824 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.830861 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.830923 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.831073 4824 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.831103 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.831123 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.831139 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.831153 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.831167 4824 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.831189 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.831472 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.831567 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.832132 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.832245 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.832443 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.832697 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.832986 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.833204 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.833781 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.835845 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.837304 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.837674 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.838416 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.839070 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.839114 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.839207 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.839665 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.839798 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.840036 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.840106 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.840119 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.840607 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.840886 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.841320 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.841443 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.841450 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.841582 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.841334 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: E0224 00:07:00.842101 4824 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 24 00:07:00 crc kubenswrapper[4824]: E0224 00:07:00.842132 4824 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.842100 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: E0224 00:07:00.842153 4824 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 24 00:07:00 crc kubenswrapper[4824]: E0224 00:07:00.842244 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-24 00:07:01.342219383 +0000 UTC m=+85.331844062 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.842420 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.842513 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.846157 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.848338 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.849064 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.849700 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: E0224 00:07:00.849953 4824 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 24 00:07:00 crc kubenswrapper[4824]: E0224 00:07:00.849982 4824 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 24 00:07:00 crc kubenswrapper[4824]: E0224 00:07:00.849999 4824 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 24 00:07:00 crc kubenswrapper[4824]: E0224 00:07:00.850092 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-24 00:07:01.350069755 +0000 UTC m=+85.339694244 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.851245 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.851287 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.851302 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.851324 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.851343 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:00Z","lastTransitionTime":"2026-02-24T00:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.856798 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.857035 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.857055 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.857380 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.857981 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.858118 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.858131 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.858973 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.859201 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.859203 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.859290 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.859377 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.859387 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.859589 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.859620 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.859693 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.859694 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.859762 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.859819 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.860218 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.860358 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.861365 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.861550 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.861904 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.861983 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.862026 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.862029 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.861911 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.862284 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.862318 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.862466 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.862838 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.862897 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.863275 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.863325 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.863366 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.863417 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.864126 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.864183 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.864371 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.864206 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.864260 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.864636 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.864793 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.864897 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.865641 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.865903 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.865913 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.866009 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.866053 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.866164 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.866193 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.866214 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.866252 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.866266 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.866305 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.866515 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.866739 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.867236 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.867692 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.867919 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.868807 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.877809 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.884276 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.886546 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.901288 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.931740 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.931813 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.931898 4824 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.931927 4824 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.931946 4824 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.931962 4824 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.931975 4824 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.931988 4824 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932001 4824 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932013 4824 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932027 4824 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932040 4824 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc 
kubenswrapper[4824]: I0224 00:07:00.932053 4824 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932064 4824 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932077 4824 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932089 4824 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932102 4824 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932114 4824 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932126 4824 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932139 4824 reconciler_common.go:293] "Volume detached for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932133 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932206 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932153 4824 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932253 4824 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932266 4824 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932280 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: 
\"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932294 4824 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932307 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932320 4824 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932333 4824 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932349 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932362 4824 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932374 4824 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932386 4824 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932398 4824 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932410 4824 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932425 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932439 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932452 4824 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932466 4824 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node 
\"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932478 4824 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932489 4824 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932502 4824 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932555 4824 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932570 4824 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932584 4824 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932596 4824 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932609 4824 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932623 4824 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932634 4824 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932647 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932659 4824 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932671 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932683 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932695 4824 reconciler_common.go:293] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932707 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932719 4824 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932731 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932744 4824 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932756 4824 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932768 4824 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932779 4824 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on 
node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932791 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932805 4824 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932816 4824 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932829 4824 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932843 4824 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932856 4824 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932868 4824 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932880 4824 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932892 4824 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932904 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932915 4824 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932927 4824 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932939 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932955 4824 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932966 4824 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" 
(UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932978 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.932989 4824 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933001 4824 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933016 4824 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933029 4824 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933040 4824 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933052 4824 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933063 4824 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933076 4824 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933087 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933100 4824 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933112 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933123 4824 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933135 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" 
DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933147 4824 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933158 4824 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933169 4824 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933181 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933193 4824 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933204 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933215 4824 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 
00:07:00.933228 4824 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933240 4824 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933252 4824 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933265 4824 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933278 4824 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933291 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933304 4824 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933315 4824 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933327 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933340 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933351 4824 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933363 4824 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933435 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933465 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933480 4824 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" 
DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933501 4824 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933513 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933554 4824 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933570 4824 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933583 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933597 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933614 4824 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 
00:07:00.933628 4824 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933641 4824 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933654 4824 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933668 4824 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933685 4824 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933704 4824 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933717 4824 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933732 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: 
\"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933751 4824 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933768 4824 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933785 4824 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933798 4824 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933810 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933824 4824 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933838 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: 
\"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933852 4824 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933865 4824 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933879 4824 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933891 4824 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933903 4824 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933919 4824 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933932 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" 
DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933945 4824 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933958 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933970 4824 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933981 4824 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.933993 4824 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.934004 4824 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.934016 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.934029 4824 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.934040 4824 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.934051 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.934072 4824 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.934084 4824 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.934096 4824 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.934107 4824 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.934119 4824 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.934136 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.934148 4824 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.934159 4824 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.934171 4824 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.934215 4824 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.934229 4824 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.934244 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Feb 24 
00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.934256 4824 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.934268 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.934280 4824 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.934291 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.934303 4824 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.934315 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.934327 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.934340 4824 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.934352 4824 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.934364 4824 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.934376 4824 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.934388 4824 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.934400 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.934412 4824 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.934424 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: 
\"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.934437 4824 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.934449 4824 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.934461 4824 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.934473 4824 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.934486 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.934498 4824 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.934510 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" 
Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.934546 4824 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.955401 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.955462 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.955478 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.955499 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.955512 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:00Z","lastTransitionTime":"2026-02-24T00:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.981383 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 24 00:07:00 crc kubenswrapper[4824]: I0224 00:07:00.996136 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.003368 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 24 00:07:01 crc kubenswrapper[4824]: E0224 00:07:01.004442 4824 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 24 00:07:01 crc kubenswrapper[4824]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Feb 24 00:07:01 crc kubenswrapper[4824]: if [[ -f "/env/_master" ]]; then Feb 24 00:07:01 crc kubenswrapper[4824]: set -o allexport Feb 24 00:07:01 crc kubenswrapper[4824]: source "/env/_master" Feb 24 00:07:01 crc kubenswrapper[4824]: set +o allexport Feb 24 00:07:01 crc kubenswrapper[4824]: fi Feb 24 00:07:01 crc kubenswrapper[4824]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Feb 24 00:07:01 crc kubenswrapper[4824]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Feb 24 00:07:01 crc kubenswrapper[4824]: ho_enable="--enable-hybrid-overlay" Feb 24 00:07:01 crc kubenswrapper[4824]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Feb 24 00:07:01 crc kubenswrapper[4824]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Feb 24 00:07:01 crc kubenswrapper[4824]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Feb 24 00:07:01 crc kubenswrapper[4824]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Feb 24 00:07:01 crc kubenswrapper[4824]: --webhook-cert-dir="/etc/webhook-cert" \ Feb 24 00:07:01 crc kubenswrapper[4824]: --webhook-host=127.0.0.1 \ Feb 24 00:07:01 crc kubenswrapper[4824]: --webhook-port=9743 \ Feb 24 00:07:01 crc kubenswrapper[4824]: ${ho_enable} \ Feb 24 00:07:01 crc kubenswrapper[4824]: --enable-interconnect \ Feb 24 00:07:01 crc kubenswrapper[4824]: --disable-approver \ Feb 24 00:07:01 crc kubenswrapper[4824]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Feb 24 00:07:01 crc kubenswrapper[4824]: --wait-for-kubernetes-api=200s \ Feb 24 00:07:01 crc kubenswrapper[4824]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Feb 24 00:07:01 crc kubenswrapper[4824]: --loglevel="${LOGLEVEL}" Feb 24 00:07:01 crc kubenswrapper[4824]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Feb 24 00:07:01 crc kubenswrapper[4824]: > logger="UnhandledError" Feb 24 00:07:01 crc kubenswrapper[4824]: E0224 00:07:01.013307 4824 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 24 00:07:01 crc kubenswrapper[4824]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Feb 24 00:07:01 crc 
kubenswrapper[4824]: if [[ -f "/env/_master" ]]; then Feb 24 00:07:01 crc kubenswrapper[4824]: set -o allexport Feb 24 00:07:01 crc kubenswrapper[4824]: source "/env/_master" Feb 24 00:07:01 crc kubenswrapper[4824]: set +o allexport Feb 24 00:07:01 crc kubenswrapper[4824]: fi Feb 24 00:07:01 crc kubenswrapper[4824]: Feb 24 00:07:01 crc kubenswrapper[4824]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Feb 24 00:07:01 crc kubenswrapper[4824]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Feb 24 00:07:01 crc kubenswrapper[4824]: --disable-webhook \ Feb 24 00:07:01 crc kubenswrapper[4824]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Feb 24 00:07:01 crc kubenswrapper[4824]: --loglevel="${LOGLEVEL}" Feb 24 00:07:01 crc kubenswrapper[4824]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Feb 24 00:07:01 crc kubenswrapper[4824]: > logger="UnhandledError" Feb 24 00:07:01 crc kubenswrapper[4824]: E0224 00:07:01.014565 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Feb 24 00:07:01 crc kubenswrapper[4824]: E0224 00:07:01.021837 4824 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePo
licy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Feb 24 00:07:01 crc kubenswrapper[4824]: E0224 00:07:01.023072 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Feb 24 00:07:01 crc kubenswrapper[4824]: W0224 00:07:01.023526 4824 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-b39afc9921960eda1889ca26d8c3baefcdafc5477509b6dda05ed54213eb5e89 WatchSource:0}: Error finding container b39afc9921960eda1889ca26d8c3baefcdafc5477509b6dda05ed54213eb5e89: Status 404 returned error can't find the container with id b39afc9921960eda1889ca26d8c3baefcdafc5477509b6dda05ed54213eb5e89 Feb 24 00:07:01 crc kubenswrapper[4824]: E0224 00:07:01.027085 4824 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 24 00:07:01 crc kubenswrapper[4824]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Feb 24 00:07:01 crc kubenswrapper[4824]: set -o allexport Feb 24 00:07:01 crc kubenswrapper[4824]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Feb 24 00:07:01 crc kubenswrapper[4824]: source /etc/kubernetes/apiserver-url.env Feb 24 00:07:01 crc kubenswrapper[4824]: else Feb 24 00:07:01 crc kubenswrapper[4824]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Feb 24 00:07:01 crc kubenswrapper[4824]: exit 1 Feb 24 00:07:01 
crc kubenswrapper[4824]: fi Feb 24 00:07:01 crc kubenswrapper[4824]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Feb 24 00:07:01 crc kubenswrapper[4824]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4
dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Feb 24 00:07:01 crc kubenswrapper[4824]: > logger="UnhandledError" Feb 24 00:07:01 crc kubenswrapper[4824]: E0224 00:07:01.028207 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 
00:07:01.058644 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.058697 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.058722 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.058750 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.058769 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:01Z","lastTransitionTime":"2026-02-24T00:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.061422 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"2538f68b39e89c4c421e500169ad97f720f05856bf8a53b21bdbe1c1af3454fd"} Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.063198 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"2cebbd1b17e2d15613f1695389433d59bfc40c927a97a476b3d61d4415972ee2"} Feb 24 00:07:01 crc kubenswrapper[4824]: E0224 00:07:01.063654 4824 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.065091 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"b39afc9921960eda1889ca26d8c3baefcdafc5477509b6dda05ed54213eb5e89"} Feb 24 00:07:01 crc kubenswrapper[4824]: E0224 00:07:01.065778 4824 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 24 00:07:01 crc kubenswrapper[4824]: container 
&Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Feb 24 00:07:01 crc kubenswrapper[4824]: if [[ -f "/env/_master" ]]; then Feb 24 00:07:01 crc kubenswrapper[4824]: set -o allexport Feb 24 00:07:01 crc kubenswrapper[4824]: source "/env/_master" Feb 24 00:07:01 crc kubenswrapper[4824]: set +o allexport Feb 24 00:07:01 crc kubenswrapper[4824]: fi Feb 24 00:07:01 crc kubenswrapper[4824]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Feb 24 00:07:01 crc kubenswrapper[4824]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Feb 24 00:07:01 crc kubenswrapper[4824]: ho_enable="--enable-hybrid-overlay" Feb 24 00:07:01 crc kubenswrapper[4824]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Feb 24 00:07:01 crc kubenswrapper[4824]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Feb 24 00:07:01 crc kubenswrapper[4824]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Feb 24 00:07:01 crc kubenswrapper[4824]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Feb 24 00:07:01 crc kubenswrapper[4824]: --webhook-cert-dir="/etc/webhook-cert" \ Feb 24 00:07:01 crc kubenswrapper[4824]: --webhook-host=127.0.0.1 \ Feb 24 00:07:01 crc kubenswrapper[4824]: --webhook-port=9743 \ Feb 24 00:07:01 crc kubenswrapper[4824]: ${ho_enable} \ Feb 24 00:07:01 crc kubenswrapper[4824]: --enable-interconnect \ Feb 24 00:07:01 crc kubenswrapper[4824]: --disable-approver \ Feb 24 00:07:01 crc kubenswrapper[4824]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Feb 24 00:07:01 crc kubenswrapper[4824]: --wait-for-kubernetes-api=200s \ Feb 24 00:07:01 crc kubenswrapper[4824]: 
--pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Feb 24 00:07:01 crc kubenswrapper[4824]: --loglevel="${LOGLEVEL}" Feb 24 00:07:01 crc kubenswrapper[4824]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Feb 24 00:07:01 crc kubenswrapper[4824]: > logger="UnhandledError" Feb 24 00:07:01 crc kubenswrapper[4824]: E0224 00:07:01.065917 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Feb 24 00:07:01 crc kubenswrapper[4824]: E0224 00:07:01.066653 4824 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 24 00:07:01 crc kubenswrapper[4824]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Feb 24 00:07:01 crc kubenswrapper[4824]: set -o allexport Feb 24 00:07:01 crc kubenswrapper[4824]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Feb 24 00:07:01 crc kubenswrapper[4824]: source /etc/kubernetes/apiserver-url.env Feb 24 00:07:01 crc kubenswrapper[4824]: else Feb 24 00:07:01 crc kubenswrapper[4824]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Feb 24 00:07:01 crc kubenswrapper[4824]: exit 1 Feb 24 00:07:01 crc kubenswrapper[4824]: fi Feb 24 00:07:01 crc kubenswrapper[4824]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Feb 24 00:07:01 crc kubenswrapper[4824]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Feb 24 00:07:01 crc kubenswrapper[4824]: > logger="UnhandledError" Feb 24 00:07:01 crc kubenswrapper[4824]: E0224 00:07:01.067738 4824 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 24 00:07:01 crc kubenswrapper[4824]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Feb 24 00:07:01 crc kubenswrapper[4824]: if [[ -f "/env/_master" ]]; then Feb 24 00:07:01 crc kubenswrapper[4824]: set -o allexport Feb 24 00:07:01 crc kubenswrapper[4824]: source "/env/_master" Feb 24 00:07:01 crc kubenswrapper[4824]: set +o allexport Feb 24 00:07:01 crc 
kubenswrapper[4824]: fi Feb 24 00:07:01 crc kubenswrapper[4824]: Feb 24 00:07:01 crc kubenswrapper[4824]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Feb 24 00:07:01 crc kubenswrapper[4824]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Feb 24 00:07:01 crc kubenswrapper[4824]: --disable-webhook \ Feb 24 00:07:01 crc kubenswrapper[4824]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Feb 24 00:07:01 crc kubenswrapper[4824]: --loglevel="${LOGLEVEL}" Feb 24 00:07:01 crc kubenswrapper[4824]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]Conta
inerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Feb 24 00:07:01 crc kubenswrapper[4824]: > logger="UnhandledError" Feb 24 00:07:01 crc kubenswrapper[4824]: E0224 00:07:01.067830 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Feb 24 00:07:01 crc kubenswrapper[4824]: E0224 00:07:01.069041 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.075475 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.089415 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.107405 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.117649 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.130344 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.143528 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.153207 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.162150 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.163387 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.163501 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.163594 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.163694 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.163788 4824 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:01Z","lastTransitionTime":"2026-02-24T00:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.172928 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.182202 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.191886 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.204387 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.267747 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.267816 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.267828 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.267848 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.267861 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:01Z","lastTransitionTime":"2026-02-24T00:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.338743 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.338859 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.338923 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:07:01 crc kubenswrapper[4824]: E0224 00:07:01.339021 4824 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 00:07:01 crc kubenswrapper[4824]: E0224 00:07:01.339039 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:07:02.33900453 +0000 UTC m=+86.328628999 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:07:01 crc kubenswrapper[4824]: E0224 00:07:01.339074 4824 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 00:07:01 crc kubenswrapper[4824]: E0224 00:07:01.339102 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 00:07:02.339082262 +0000 UTC m=+86.328706731 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 00:07:01 crc kubenswrapper[4824]: E0224 00:07:01.339120 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 00:07:02.339109713 +0000 UTC m=+86.328734182 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.370369 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.370416 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.370430 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.370450 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.370463 4824 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:01Z","lastTransitionTime":"2026-02-24T00:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.439658 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.439725 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:07:01 crc kubenswrapper[4824]: E0224 00:07:01.439955 4824 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 24 00:07:01 crc kubenswrapper[4824]: E0224 00:07:01.439981 4824 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 24 00:07:01 crc kubenswrapper[4824]: E0224 00:07:01.439997 4824 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 00:07:01 crc kubenswrapper[4824]: E0224 00:07:01.439992 4824 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 24 00:07:01 crc kubenswrapper[4824]: E0224 00:07:01.440068 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-24 00:07:02.440049759 +0000 UTC m=+86.429674238 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 00:07:01 crc kubenswrapper[4824]: E0224 00:07:01.440071 4824 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 24 00:07:01 crc kubenswrapper[4824]: E0224 00:07:01.440092 4824 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 00:07:01 crc kubenswrapper[4824]: E0224 00:07:01.440177 4824 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-24 00:07:02.440150981 +0000 UTC m=+86.429775490 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.475042 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.475096 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.475106 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.475162 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.475174 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:01Z","lastTransitionTime":"2026-02-24T00:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.578023 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.578078 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.578093 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.578114 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.578128 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:01Z","lastTransitionTime":"2026-02-24T00:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.681440 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.681494 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.681507 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.681543 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.681554 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:01Z","lastTransitionTime":"2026-02-24T00:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.690718 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 11:09:28.567422229 +0000 UTC Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.693017 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:07:01 crc kubenswrapper[4824]: E0224 00:07:01.693173 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.783873 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.783953 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.783964 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.783982 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.783994 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:01Z","lastTransitionTime":"2026-02-24T00:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.886578 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.886641 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.886658 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.886687 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.886707 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:01Z","lastTransitionTime":"2026-02-24T00:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.989271 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.989343 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.989353 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.989372 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:01 crc kubenswrapper[4824]: I0224 00:07:01.989382 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:01Z","lastTransitionTime":"2026-02-24T00:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.092757 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.093107 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.093262 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.093354 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.093444 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:02Z","lastTransitionTime":"2026-02-24T00:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.195958 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.196620 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.196650 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.196674 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.196692 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:02Z","lastTransitionTime":"2026-02-24T00:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.300158 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.300201 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.300215 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.300233 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.300244 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:02Z","lastTransitionTime":"2026-02-24T00:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.349251 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.349356 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.349391 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:07:02 crc kubenswrapper[4824]: E0224 00:07:02.349506 4824 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 00:07:02 crc kubenswrapper[4824]: E0224 00:07:02.349599 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:07:04.34955174 +0000 UTC m=+88.339176219 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:07:02 crc kubenswrapper[4824]: E0224 00:07:02.349660 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 00:07:04.349644712 +0000 UTC m=+88.339269271 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 00:07:02 crc kubenswrapper[4824]: E0224 00:07:02.349753 4824 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 00:07:02 crc kubenswrapper[4824]: E0224 00:07:02.349938 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 00:07:04.349889978 +0000 UTC m=+88.339514577 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.403685 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.403735 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.403750 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.403768 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.403778 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:02Z","lastTransitionTime":"2026-02-24T00:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.450571 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.450652 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:07:02 crc kubenswrapper[4824]: E0224 00:07:02.450893 4824 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 24 00:07:02 crc kubenswrapper[4824]: E0224 00:07:02.450915 4824 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 24 00:07:02 crc kubenswrapper[4824]: E0224 00:07:02.450910 4824 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 24 00:07:02 crc kubenswrapper[4824]: E0224 00:07:02.450982 4824 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 24 00:07:02 crc kubenswrapper[4824]: E0224 00:07:02.451007 4824 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr 
for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 00:07:02 crc kubenswrapper[4824]: E0224 00:07:02.450928 4824 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 00:07:02 crc kubenswrapper[4824]: E0224 00:07:02.451106 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-24 00:07:04.451077331 +0000 UTC m=+88.440701840 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 00:07:02 crc kubenswrapper[4824]: E0224 00:07:02.451140 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-24 00:07:04.451126302 +0000 UTC m=+88.440750811 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.508321 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.508402 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.508415 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.508459 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.508473 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:02Z","lastTransitionTime":"2026-02-24T00:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.590349 4824 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.612362 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.612433 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.612452 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.612481 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.612502 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:02Z","lastTransitionTime":"2026-02-24T00:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.691318 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 19:13:43.475245706 +0000 UTC Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.693918 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:07:02 crc kubenswrapper[4824]: E0224 00:07:02.694194 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.694390 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:07:02 crc kubenswrapper[4824]: E0224 00:07:02.694707 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.701178 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.702171 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.704067 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.705894 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.708642 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.710069 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.711576 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.714926 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.716017 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.716257 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.716459 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.716725 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.716949 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:02Z","lastTransitionTime":"2026-02-24T00:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.717331 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.719456 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.720700 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.722323 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.723506 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.724671 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.728209 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.730741 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.733789 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.734703 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.736120 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.738609 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.740007 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.742575 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.743622 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.746036 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" 
path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.747135 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.749542 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.751278 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.752595 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.754483 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.755738 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.757058 4824 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.757240 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.759757 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.761284 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.761891 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.764018 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.764925 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.766738 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.768349 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.771664 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.772976 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.775645 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.777222 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.778872 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.780055 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.781387 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.782645 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.784355 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.785831 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.786976 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.788840 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.789804 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.790858 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.791627 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.821844 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.822228 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:02 crc 
kubenswrapper[4824]: I0224 00:07:02.822426 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.822628 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.822786 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:02Z","lastTransitionTime":"2026-02-24T00:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.926556 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.926637 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.926647 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.926667 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:02 crc kubenswrapper[4824]: I0224 00:07:02.926695 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:02Z","lastTransitionTime":"2026-02-24T00:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:03 crc kubenswrapper[4824]: I0224 00:07:03.030329 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:03 crc kubenswrapper[4824]: I0224 00:07:03.030418 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:03 crc kubenswrapper[4824]: I0224 00:07:03.030440 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:03 crc kubenswrapper[4824]: I0224 00:07:03.030474 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:03 crc kubenswrapper[4824]: I0224 00:07:03.030500 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:03Z","lastTransitionTime":"2026-02-24T00:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:03 crc kubenswrapper[4824]: I0224 00:07:03.134408 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:03 crc kubenswrapper[4824]: I0224 00:07:03.134502 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:03 crc kubenswrapper[4824]: I0224 00:07:03.134561 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:03 crc kubenswrapper[4824]: I0224 00:07:03.134598 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:03 crc kubenswrapper[4824]: I0224 00:07:03.134623 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:03Z","lastTransitionTime":"2026-02-24T00:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:03 crc kubenswrapper[4824]: I0224 00:07:03.238027 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:03 crc kubenswrapper[4824]: I0224 00:07:03.238083 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:03 crc kubenswrapper[4824]: I0224 00:07:03.238093 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:03 crc kubenswrapper[4824]: I0224 00:07:03.238114 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:03 crc kubenswrapper[4824]: I0224 00:07:03.238127 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:03Z","lastTransitionTime":"2026-02-24T00:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:03 crc kubenswrapper[4824]: I0224 00:07:03.341176 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:03 crc kubenswrapper[4824]: I0224 00:07:03.341222 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:03 crc kubenswrapper[4824]: I0224 00:07:03.341231 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:03 crc kubenswrapper[4824]: I0224 00:07:03.341247 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:03 crc kubenswrapper[4824]: I0224 00:07:03.341258 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:03Z","lastTransitionTime":"2026-02-24T00:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:03 crc kubenswrapper[4824]: I0224 00:07:03.444838 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:03 crc kubenswrapper[4824]: I0224 00:07:03.444894 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:03 crc kubenswrapper[4824]: I0224 00:07:03.444908 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:03 crc kubenswrapper[4824]: I0224 00:07:03.444927 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:03 crc kubenswrapper[4824]: I0224 00:07:03.444941 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:03Z","lastTransitionTime":"2026-02-24T00:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:03 crc kubenswrapper[4824]: I0224 00:07:03.548728 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:03 crc kubenswrapper[4824]: I0224 00:07:03.548799 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:03 crc kubenswrapper[4824]: I0224 00:07:03.548817 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:03 crc kubenswrapper[4824]: I0224 00:07:03.548845 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:03 crc kubenswrapper[4824]: I0224 00:07:03.548863 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:03Z","lastTransitionTime":"2026-02-24T00:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:03 crc kubenswrapper[4824]: I0224 00:07:03.651729 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:03 crc kubenswrapper[4824]: I0224 00:07:03.651788 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:03 crc kubenswrapper[4824]: I0224 00:07:03.651806 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:03 crc kubenswrapper[4824]: I0224 00:07:03.651829 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:03 crc kubenswrapper[4824]: I0224 00:07:03.651847 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:03Z","lastTransitionTime":"2026-02-24T00:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:03 crc kubenswrapper[4824]: I0224 00:07:03.692110 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 03:31:39.497937442 +0000 UTC Feb 24 00:07:03 crc kubenswrapper[4824]: I0224 00:07:03.693415 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:07:03 crc kubenswrapper[4824]: E0224 00:07:03.693670 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 00:07:03 crc kubenswrapper[4824]: I0224 00:07:03.754974 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:03 crc kubenswrapper[4824]: I0224 00:07:03.755014 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:03 crc kubenswrapper[4824]: I0224 00:07:03.755024 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:03 crc kubenswrapper[4824]: I0224 00:07:03.755041 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:03 crc kubenswrapper[4824]: I0224 00:07:03.755050 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:03Z","lastTransitionTime":"2026-02-24T00:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:03 crc kubenswrapper[4824]: I0224 00:07:03.858414 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:03 crc kubenswrapper[4824]: I0224 00:07:03.858473 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:03 crc kubenswrapper[4824]: I0224 00:07:03.858488 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:03 crc kubenswrapper[4824]: I0224 00:07:03.858507 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:03 crc kubenswrapper[4824]: I0224 00:07:03.858574 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:03Z","lastTransitionTime":"2026-02-24T00:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:03 crc kubenswrapper[4824]: I0224 00:07:03.962100 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:03 crc kubenswrapper[4824]: I0224 00:07:03.962169 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:03 crc kubenswrapper[4824]: I0224 00:07:03.962188 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:03 crc kubenswrapper[4824]: I0224 00:07:03.962216 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:03 crc kubenswrapper[4824]: I0224 00:07:03.962234 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:03Z","lastTransitionTime":"2026-02-24T00:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:04 crc kubenswrapper[4824]: I0224 00:07:04.065772 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:04 crc kubenswrapper[4824]: I0224 00:07:04.065827 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:04 crc kubenswrapper[4824]: I0224 00:07:04.065842 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:04 crc kubenswrapper[4824]: I0224 00:07:04.065868 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:04 crc kubenswrapper[4824]: I0224 00:07:04.065894 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:04Z","lastTransitionTime":"2026-02-24T00:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:04 crc kubenswrapper[4824]: I0224 00:07:04.170166 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:04 crc kubenswrapper[4824]: I0224 00:07:04.170240 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:04 crc kubenswrapper[4824]: I0224 00:07:04.170249 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:04 crc kubenswrapper[4824]: I0224 00:07:04.170265 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:04 crc kubenswrapper[4824]: I0224 00:07:04.170277 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:04Z","lastTransitionTime":"2026-02-24T00:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:04 crc kubenswrapper[4824]: I0224 00:07:04.273226 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:04 crc kubenswrapper[4824]: I0224 00:07:04.273271 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:04 crc kubenswrapper[4824]: I0224 00:07:04.273284 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:04 crc kubenswrapper[4824]: I0224 00:07:04.273303 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:04 crc kubenswrapper[4824]: I0224 00:07:04.273318 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:04Z","lastTransitionTime":"2026-02-24T00:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:04 crc kubenswrapper[4824]: I0224 00:07:04.369991 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:07:04 crc kubenswrapper[4824]: I0224 00:07:04.370088 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:07:04 crc kubenswrapper[4824]: E0224 00:07:04.370130 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:07:08.370091327 +0000 UTC m=+92.359715816 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:07:04 crc kubenswrapper[4824]: I0224 00:07:04.370164 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:07:04 crc kubenswrapper[4824]: E0224 00:07:04.370218 4824 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 00:07:04 crc kubenswrapper[4824]: E0224 00:07:04.370268 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 00:07:08.370254202 +0000 UTC m=+92.359878691 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 00:07:04 crc kubenswrapper[4824]: E0224 00:07:04.370301 4824 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 00:07:04 crc kubenswrapper[4824]: E0224 00:07:04.370427 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 00:07:08.370394055 +0000 UTC m=+92.360018534 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 00:07:04 crc kubenswrapper[4824]: I0224 00:07:04.376497 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:04 crc kubenswrapper[4824]: I0224 00:07:04.376543 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:04 crc kubenswrapper[4824]: I0224 00:07:04.376554 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:04 crc kubenswrapper[4824]: I0224 00:07:04.376573 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:04 crc 
kubenswrapper[4824]: I0224 00:07:04.376592 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:04Z","lastTransitionTime":"2026-02-24T00:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:04 crc kubenswrapper[4824]: I0224 00:07:04.471732 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:07:04 crc kubenswrapper[4824]: I0224 00:07:04.471793 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:07:04 crc kubenswrapper[4824]: E0224 00:07:04.472046 4824 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 24 00:07:04 crc kubenswrapper[4824]: E0224 00:07:04.472104 4824 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 24 00:07:04 crc kubenswrapper[4824]: E0224 00:07:04.472120 4824 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 00:07:04 crc kubenswrapper[4824]: E0224 00:07:04.472211 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-24 00:07:08.472190083 +0000 UTC m=+92.461814572 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 00:07:04 crc kubenswrapper[4824]: E0224 00:07:04.472750 4824 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 24 00:07:04 crc kubenswrapper[4824]: E0224 00:07:04.472902 4824 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 24 00:07:04 crc kubenswrapper[4824]: E0224 00:07:04.472995 4824 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 00:07:04 crc kubenswrapper[4824]: E0224 00:07:04.473153 4824 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-24 00:07:08.473129027 +0000 UTC m=+92.462753506 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 00:07:04 crc kubenswrapper[4824]: I0224 00:07:04.479692 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:04 crc kubenswrapper[4824]: I0224 00:07:04.479745 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:04 crc kubenswrapper[4824]: I0224 00:07:04.479760 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:04 crc kubenswrapper[4824]: I0224 00:07:04.479782 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:04 crc kubenswrapper[4824]: I0224 00:07:04.479795 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:04Z","lastTransitionTime":"2026-02-24T00:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:04 crc kubenswrapper[4824]: I0224 00:07:04.582975 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:04 crc kubenswrapper[4824]: I0224 00:07:04.583057 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:04 crc kubenswrapper[4824]: I0224 00:07:04.583071 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:04 crc kubenswrapper[4824]: I0224 00:07:04.583628 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:04 crc kubenswrapper[4824]: I0224 00:07:04.583765 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:04Z","lastTransitionTime":"2026-02-24T00:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:04 crc kubenswrapper[4824]: I0224 00:07:04.686973 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:04 crc kubenswrapper[4824]: I0224 00:07:04.687063 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:04 crc kubenswrapper[4824]: I0224 00:07:04.687082 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:04 crc kubenswrapper[4824]: I0224 00:07:04.687111 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:04 crc kubenswrapper[4824]: I0224 00:07:04.687132 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:04Z","lastTransitionTime":"2026-02-24T00:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:04 crc kubenswrapper[4824]: I0224 00:07:04.693222 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 06:04:25.431215024 +0000 UTC Feb 24 00:07:04 crc kubenswrapper[4824]: I0224 00:07:04.693457 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:07:04 crc kubenswrapper[4824]: I0224 00:07:04.693512 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:07:04 crc kubenswrapper[4824]: E0224 00:07:04.693887 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 00:07:04 crc kubenswrapper[4824]: E0224 00:07:04.693899 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 00:07:04 crc kubenswrapper[4824]: I0224 00:07:04.790868 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:04 crc kubenswrapper[4824]: I0224 00:07:04.790918 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:04 crc kubenswrapper[4824]: I0224 00:07:04.790936 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:04 crc kubenswrapper[4824]: I0224 00:07:04.790957 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:04 crc kubenswrapper[4824]: I0224 00:07:04.790969 4824 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:04Z","lastTransitionTime":"2026-02-24T00:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:04 crc kubenswrapper[4824]: I0224 00:07:04.893566 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:04 crc kubenswrapper[4824]: I0224 00:07:04.893610 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:04 crc kubenswrapper[4824]: I0224 00:07:04.893622 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:04 crc kubenswrapper[4824]: I0224 00:07:04.893640 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:04 crc kubenswrapper[4824]: I0224 00:07:04.893651 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:04Z","lastTransitionTime":"2026-02-24T00:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:04 crc kubenswrapper[4824]: I0224 00:07:04.996333 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:04 crc kubenswrapper[4824]: I0224 00:07:04.996376 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:04 crc kubenswrapper[4824]: I0224 00:07:04.996389 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:04 crc kubenswrapper[4824]: I0224 00:07:04.996406 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:04 crc kubenswrapper[4824]: I0224 00:07:04.996416 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:04Z","lastTransitionTime":"2026-02-24T00:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:05 crc kubenswrapper[4824]: I0224 00:07:05.099161 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:05 crc kubenswrapper[4824]: I0224 00:07:05.099219 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:05 crc kubenswrapper[4824]: I0224 00:07:05.099235 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:05 crc kubenswrapper[4824]: I0224 00:07:05.099258 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:05 crc kubenswrapper[4824]: I0224 00:07:05.099274 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:05Z","lastTransitionTime":"2026-02-24T00:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:05 crc kubenswrapper[4824]: I0224 00:07:05.202704 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:05 crc kubenswrapper[4824]: I0224 00:07:05.202747 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:05 crc kubenswrapper[4824]: I0224 00:07:05.202759 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:05 crc kubenswrapper[4824]: I0224 00:07:05.202775 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:05 crc kubenswrapper[4824]: I0224 00:07:05.202784 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:05Z","lastTransitionTime":"2026-02-24T00:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:05 crc kubenswrapper[4824]: I0224 00:07:05.306329 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:05 crc kubenswrapper[4824]: I0224 00:07:05.306385 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:05 crc kubenswrapper[4824]: I0224 00:07:05.306397 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:05 crc kubenswrapper[4824]: I0224 00:07:05.306426 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:05 crc kubenswrapper[4824]: I0224 00:07:05.306441 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:05Z","lastTransitionTime":"2026-02-24T00:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:05 crc kubenswrapper[4824]: I0224 00:07:05.409391 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:05 crc kubenswrapper[4824]: I0224 00:07:05.409491 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:05 crc kubenswrapper[4824]: I0224 00:07:05.409549 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:05 crc kubenswrapper[4824]: I0224 00:07:05.409590 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:05 crc kubenswrapper[4824]: I0224 00:07:05.409618 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:05Z","lastTransitionTime":"2026-02-24T00:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:05 crc kubenswrapper[4824]: I0224 00:07:05.512658 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:05 crc kubenswrapper[4824]: I0224 00:07:05.512712 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:05 crc kubenswrapper[4824]: I0224 00:07:05.512724 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:05 crc kubenswrapper[4824]: I0224 00:07:05.512745 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:05 crc kubenswrapper[4824]: I0224 00:07:05.512759 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:05Z","lastTransitionTime":"2026-02-24T00:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:05 crc kubenswrapper[4824]: I0224 00:07:05.615297 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:05 crc kubenswrapper[4824]: I0224 00:07:05.615338 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:05 crc kubenswrapper[4824]: I0224 00:07:05.615347 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:05 crc kubenswrapper[4824]: I0224 00:07:05.615363 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:05 crc kubenswrapper[4824]: I0224 00:07:05.615373 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:05Z","lastTransitionTime":"2026-02-24T00:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:05 crc kubenswrapper[4824]: I0224 00:07:05.692859 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:07:05 crc kubenswrapper[4824]: E0224 00:07:05.693043 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 00:07:05 crc kubenswrapper[4824]: I0224 00:07:05.693921 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 19:42:35.917051813 +0000 UTC Feb 24 00:07:05 crc kubenswrapper[4824]: I0224 00:07:05.718873 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:05 crc kubenswrapper[4824]: I0224 00:07:05.718947 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:05 crc kubenswrapper[4824]: I0224 00:07:05.718966 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:05 crc kubenswrapper[4824]: I0224 00:07:05.718996 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:05 crc kubenswrapper[4824]: I0224 00:07:05.719014 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:05Z","lastTransitionTime":"2026-02-24T00:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:05 crc kubenswrapper[4824]: I0224 00:07:05.822271 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:05 crc kubenswrapper[4824]: I0224 00:07:05.822346 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:05 crc kubenswrapper[4824]: I0224 00:07:05.822372 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:05 crc kubenswrapper[4824]: I0224 00:07:05.822404 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:05 crc kubenswrapper[4824]: I0224 00:07:05.822430 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:05Z","lastTransitionTime":"2026-02-24T00:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:05 crc kubenswrapper[4824]: I0224 00:07:05.924899 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:05 crc kubenswrapper[4824]: I0224 00:07:05.924944 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:05 crc kubenswrapper[4824]: I0224 00:07:05.924955 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:05 crc kubenswrapper[4824]: I0224 00:07:05.924971 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:05 crc kubenswrapper[4824]: I0224 00:07:05.924982 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:05Z","lastTransitionTime":"2026-02-24T00:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.027936 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.027988 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.027999 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.028015 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.028027 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:06Z","lastTransitionTime":"2026-02-24T00:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.131197 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.131245 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.131258 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.131277 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.131290 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:06Z","lastTransitionTime":"2026-02-24T00:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.233882 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.233940 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.233953 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.233971 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.233986 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:06Z","lastTransitionTime":"2026-02-24T00:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.337229 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.337775 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.337928 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.338080 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.338222 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:06Z","lastTransitionTime":"2026-02-24T00:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.442509 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.442585 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.442595 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.442620 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.442633 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:06Z","lastTransitionTime":"2026-02-24T00:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.545502 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.545580 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.545596 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.545619 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.545635 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:06Z","lastTransitionTime":"2026-02-24T00:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.649603 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.649888 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.649956 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.650034 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.650164 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:06Z","lastTransitionTime":"2026-02-24T00:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.693309 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.693430 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.694276 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 23:22:08.239732663 +0000 UTC Feb 24 00:07:06 crc kubenswrapper[4824]: E0224 00:07:06.694172 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 00:07:06 crc kubenswrapper[4824]: E0224 00:07:06.694354 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.705696 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.715271 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.722979 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.733318 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.742849 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.752398 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.752430 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.752440 4824 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.752471 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.752482 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:06Z","lastTransitionTime":"2026-02-24T00:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.753684 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.856296 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.856336 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:06 crc 
kubenswrapper[4824]: I0224 00:07:06.856347 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.856364 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.856376 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:06Z","lastTransitionTime":"2026-02-24T00:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.959036 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.959118 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.959128 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.959143 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:06 crc kubenswrapper[4824]: I0224 00:07:06.959156 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:06Z","lastTransitionTime":"2026-02-24T00:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:07 crc kubenswrapper[4824]: I0224 00:07:07.063118 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:07 crc kubenswrapper[4824]: I0224 00:07:07.063153 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:07 crc kubenswrapper[4824]: I0224 00:07:07.063162 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:07 crc kubenswrapper[4824]: I0224 00:07:07.063177 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:07 crc kubenswrapper[4824]: I0224 00:07:07.063191 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:07Z","lastTransitionTime":"2026-02-24T00:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:07 crc kubenswrapper[4824]: I0224 00:07:07.165552 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:07 crc kubenswrapper[4824]: I0224 00:07:07.165906 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:07 crc kubenswrapper[4824]: I0224 00:07:07.166113 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:07 crc kubenswrapper[4824]: I0224 00:07:07.166220 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:07 crc kubenswrapper[4824]: I0224 00:07:07.166284 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:07Z","lastTransitionTime":"2026-02-24T00:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:07 crc kubenswrapper[4824]: I0224 00:07:07.268854 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:07 crc kubenswrapper[4824]: I0224 00:07:07.268917 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:07 crc kubenswrapper[4824]: I0224 00:07:07.268936 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:07 crc kubenswrapper[4824]: I0224 00:07:07.268962 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:07 crc kubenswrapper[4824]: I0224 00:07:07.268982 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:07Z","lastTransitionTime":"2026-02-24T00:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:07 crc kubenswrapper[4824]: I0224 00:07:07.371320 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:07 crc kubenswrapper[4824]: I0224 00:07:07.371708 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:07 crc kubenswrapper[4824]: I0224 00:07:07.371887 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:07 crc kubenswrapper[4824]: I0224 00:07:07.372010 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:07 crc kubenswrapper[4824]: I0224 00:07:07.372115 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:07Z","lastTransitionTime":"2026-02-24T00:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:07 crc kubenswrapper[4824]: I0224 00:07:07.475025 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:07 crc kubenswrapper[4824]: I0224 00:07:07.475081 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:07 crc kubenswrapper[4824]: I0224 00:07:07.475094 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:07 crc kubenswrapper[4824]: I0224 00:07:07.475112 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:07 crc kubenswrapper[4824]: I0224 00:07:07.475124 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:07Z","lastTransitionTime":"2026-02-24T00:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:07 crc kubenswrapper[4824]: I0224 00:07:07.578801 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:07 crc kubenswrapper[4824]: I0224 00:07:07.579218 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:07 crc kubenswrapper[4824]: I0224 00:07:07.579290 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:07 crc kubenswrapper[4824]: I0224 00:07:07.579362 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:07 crc kubenswrapper[4824]: I0224 00:07:07.579466 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:07Z","lastTransitionTime":"2026-02-24T00:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:07 crc kubenswrapper[4824]: I0224 00:07:07.682350 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:07 crc kubenswrapper[4824]: I0224 00:07:07.682409 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:07 crc kubenswrapper[4824]: I0224 00:07:07.682423 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:07 crc kubenswrapper[4824]: I0224 00:07:07.682444 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:07 crc kubenswrapper[4824]: I0224 00:07:07.682459 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:07Z","lastTransitionTime":"2026-02-24T00:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:07 crc kubenswrapper[4824]: I0224 00:07:07.693534 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:07:07 crc kubenswrapper[4824]: E0224 00:07:07.694201 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 00:07:07 crc kubenswrapper[4824]: I0224 00:07:07.694409 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 01:03:42.991021463 +0000 UTC Feb 24 00:07:07 crc kubenswrapper[4824]: I0224 00:07:07.727581 4824 scope.go:117] "RemoveContainer" containerID="b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a" Feb 24 00:07:07 crc kubenswrapper[4824]: E0224 00:07:07.727907 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 24 00:07:07 crc kubenswrapper[4824]: I0224 00:07:07.728513 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 24 00:07:07 crc kubenswrapper[4824]: I0224 00:07:07.785889 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:07 crc kubenswrapper[4824]: I0224 00:07:07.785976 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:07 crc kubenswrapper[4824]: I0224 00:07:07.786039 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:07 crc kubenswrapper[4824]: I0224 00:07:07.786073 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:07 crc kubenswrapper[4824]: I0224 00:07:07.786145 4824 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:07Z","lastTransitionTime":"2026-02-24T00:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:07 crc kubenswrapper[4824]: I0224 00:07:07.889274 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:07 crc kubenswrapper[4824]: I0224 00:07:07.889352 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:07 crc kubenswrapper[4824]: I0224 00:07:07.889366 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:07 crc kubenswrapper[4824]: I0224 00:07:07.889463 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:07 crc kubenswrapper[4824]: I0224 00:07:07.889483 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:07Z","lastTransitionTime":"2026-02-24T00:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:07 crc kubenswrapper[4824]: I0224 00:07:07.995087 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:07 crc kubenswrapper[4824]: I0224 00:07:07.995454 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:07 crc kubenswrapper[4824]: I0224 00:07:07.995597 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:07 crc kubenswrapper[4824]: I0224 00:07:07.995693 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:07 crc kubenswrapper[4824]: I0224 00:07:07.995782 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:07Z","lastTransitionTime":"2026-02-24T00:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:08 crc kubenswrapper[4824]: I0224 00:07:08.085753 4824 scope.go:117] "RemoveContainer" containerID="b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a" Feb 24 00:07:08 crc kubenswrapper[4824]: E0224 00:07:08.085978 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 24 00:07:08 crc kubenswrapper[4824]: I0224 00:07:08.098906 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:08 crc kubenswrapper[4824]: I0224 00:07:08.098949 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:08 crc kubenswrapper[4824]: I0224 00:07:08.098963 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:08 crc kubenswrapper[4824]: I0224 00:07:08.098981 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:08 crc kubenswrapper[4824]: I0224 00:07:08.098993 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:08Z","lastTransitionTime":"2026-02-24T00:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:08 crc kubenswrapper[4824]: I0224 00:07:08.201887 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:08 crc kubenswrapper[4824]: I0224 00:07:08.201930 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:08 crc kubenswrapper[4824]: I0224 00:07:08.201939 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:08 crc kubenswrapper[4824]: I0224 00:07:08.201955 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:08 crc kubenswrapper[4824]: I0224 00:07:08.201970 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:08Z","lastTransitionTime":"2026-02-24T00:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:08 crc kubenswrapper[4824]: I0224 00:07:08.304924 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:08 crc kubenswrapper[4824]: I0224 00:07:08.304975 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:08 crc kubenswrapper[4824]: I0224 00:07:08.304989 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:08 crc kubenswrapper[4824]: I0224 00:07:08.305011 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:08 crc kubenswrapper[4824]: I0224 00:07:08.305023 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:08Z","lastTransitionTime":"2026-02-24T00:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:08 crc kubenswrapper[4824]: I0224 00:07:08.407771 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:07:08 crc kubenswrapper[4824]: I0224 00:07:08.407883 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:07:08 crc kubenswrapper[4824]: E0224 00:07:08.407985 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:07:16.40795859 +0000 UTC m=+100.397583069 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:07:08 crc kubenswrapper[4824]: E0224 00:07:08.408056 4824 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 00:07:08 crc kubenswrapper[4824]: E0224 00:07:08.408120 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 00:07:16.408102764 +0000 UTC m=+100.397727253 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 00:07:08 crc kubenswrapper[4824]: I0224 00:07:08.408056 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:07:08 crc kubenswrapper[4824]: I0224 00:07:08.408055 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:08 crc kubenswrapper[4824]: E0224 00:07:08.408156 4824 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 00:07:08 crc kubenswrapper[4824]: I0224 00:07:08.408186 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:08 crc kubenswrapper[4824]: I0224 00:07:08.408204 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:08 crc kubenswrapper[4824]: E0224 00:07:08.408213 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 00:07:16.408199727 +0000 UTC m=+100.397824206 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 00:07:08 crc kubenswrapper[4824]: I0224 00:07:08.408228 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:08 crc kubenswrapper[4824]: I0224 00:07:08.408245 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:08Z","lastTransitionTime":"2026-02-24T00:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:08 crc kubenswrapper[4824]: I0224 00:07:08.509460 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:07:08 crc kubenswrapper[4824]: I0224 00:07:08.509585 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:07:08 crc kubenswrapper[4824]: E0224 00:07:08.509739 4824 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 24 00:07:08 crc kubenswrapper[4824]: E0224 00:07:08.509783 4824 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 24 00:07:08 crc kubenswrapper[4824]: E0224 00:07:08.509800 4824 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 00:07:08 crc kubenswrapper[4824]: E0224 00:07:08.509861 4824 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 24 00:07:08 crc kubenswrapper[4824]: E0224 00:07:08.509876 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-24 00:07:16.509855351 +0000 UTC m=+100.499479840 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 00:07:08 crc kubenswrapper[4824]: E0224 00:07:08.509903 4824 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 24 00:07:08 crc kubenswrapper[4824]: E0224 00:07:08.509918 4824 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 00:07:08 crc kubenswrapper[4824]: E0224 00:07:08.509974 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-24 00:07:16.509956034 +0000 UTC m=+100.499580503 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 00:07:08 crc kubenswrapper[4824]: I0224 00:07:08.511002 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:08 crc kubenswrapper[4824]: I0224 00:07:08.511031 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:08 crc kubenswrapper[4824]: I0224 00:07:08.511041 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:08 crc kubenswrapper[4824]: I0224 00:07:08.511055 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:08 crc kubenswrapper[4824]: I0224 00:07:08.511064 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:08Z","lastTransitionTime":"2026-02-24T00:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:08 crc kubenswrapper[4824]: I0224 00:07:08.613703 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:08 crc kubenswrapper[4824]: I0224 00:07:08.614148 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:08 crc kubenswrapper[4824]: I0224 00:07:08.614302 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:08 crc kubenswrapper[4824]: I0224 00:07:08.614467 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:08 crc kubenswrapper[4824]: I0224 00:07:08.614652 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:08Z","lastTransitionTime":"2026-02-24T00:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:08 crc kubenswrapper[4824]: I0224 00:07:08.693107 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:07:08 crc kubenswrapper[4824]: I0224 00:07:08.693201 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:07:08 crc kubenswrapper[4824]: E0224 00:07:08.694361 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 00:07:08 crc kubenswrapper[4824]: E0224 00:07:08.694571 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 00:07:08 crc kubenswrapper[4824]: I0224 00:07:08.694488 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 20:14:03.124980166 +0000 UTC Feb 24 00:07:08 crc kubenswrapper[4824]: I0224 00:07:08.717973 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:08 crc kubenswrapper[4824]: I0224 00:07:08.718194 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:08 crc kubenswrapper[4824]: I0224 00:07:08.718205 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:08 crc kubenswrapper[4824]: I0224 00:07:08.718250 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:08 crc kubenswrapper[4824]: I0224 00:07:08.718264 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:08Z","lastTransitionTime":"2026-02-24T00:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:08 crc kubenswrapper[4824]: I0224 00:07:08.821378 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:08 crc kubenswrapper[4824]: I0224 00:07:08.821426 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:08 crc kubenswrapper[4824]: I0224 00:07:08.821439 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:08 crc kubenswrapper[4824]: I0224 00:07:08.821466 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:08 crc kubenswrapper[4824]: I0224 00:07:08.821480 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:08Z","lastTransitionTime":"2026-02-24T00:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:08 crc kubenswrapper[4824]: I0224 00:07:08.924066 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:08 crc kubenswrapper[4824]: I0224 00:07:08.924116 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:08 crc kubenswrapper[4824]: I0224 00:07:08.924147 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:08 crc kubenswrapper[4824]: I0224 00:07:08.924166 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:08 crc kubenswrapper[4824]: I0224 00:07:08.924176 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:08Z","lastTransitionTime":"2026-02-24T00:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:09 crc kubenswrapper[4824]: I0224 00:07:09.027198 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:09 crc kubenswrapper[4824]: I0224 00:07:09.027589 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:09 crc kubenswrapper[4824]: I0224 00:07:09.027678 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:09 crc kubenswrapper[4824]: I0224 00:07:09.027781 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:09 crc kubenswrapper[4824]: I0224 00:07:09.027883 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:09Z","lastTransitionTime":"2026-02-24T00:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:09 crc kubenswrapper[4824]: I0224 00:07:09.131305 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:09 crc kubenswrapper[4824]: I0224 00:07:09.131810 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:09 crc kubenswrapper[4824]: I0224 00:07:09.132029 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:09 crc kubenswrapper[4824]: I0224 00:07:09.132210 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:09 crc kubenswrapper[4824]: I0224 00:07:09.132394 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:09Z","lastTransitionTime":"2026-02-24T00:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:09 crc kubenswrapper[4824]: I0224 00:07:09.236336 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:09 crc kubenswrapper[4824]: I0224 00:07:09.236389 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:09 crc kubenswrapper[4824]: I0224 00:07:09.236399 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:09 crc kubenswrapper[4824]: I0224 00:07:09.236419 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:09 crc kubenswrapper[4824]: I0224 00:07:09.236439 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:09Z","lastTransitionTime":"2026-02-24T00:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:09 crc kubenswrapper[4824]: I0224 00:07:09.338804 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:09 crc kubenswrapper[4824]: I0224 00:07:09.338838 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:09 crc kubenswrapper[4824]: I0224 00:07:09.338847 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:09 crc kubenswrapper[4824]: I0224 00:07:09.338862 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:09 crc kubenswrapper[4824]: I0224 00:07:09.338875 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:09Z","lastTransitionTime":"2026-02-24T00:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:09 crc kubenswrapper[4824]: I0224 00:07:09.441962 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:09 crc kubenswrapper[4824]: I0224 00:07:09.442042 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:09 crc kubenswrapper[4824]: I0224 00:07:09.442067 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:09 crc kubenswrapper[4824]: I0224 00:07:09.442104 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:09 crc kubenswrapper[4824]: I0224 00:07:09.442129 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:09Z","lastTransitionTime":"2026-02-24T00:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:09 crc kubenswrapper[4824]: I0224 00:07:09.544861 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:09 crc kubenswrapper[4824]: I0224 00:07:09.545035 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:09 crc kubenswrapper[4824]: I0224 00:07:09.545062 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:09 crc kubenswrapper[4824]: I0224 00:07:09.545096 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:09 crc kubenswrapper[4824]: I0224 00:07:09.545119 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:09Z","lastTransitionTime":"2026-02-24T00:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:09 crc kubenswrapper[4824]: I0224 00:07:09.648983 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:09 crc kubenswrapper[4824]: I0224 00:07:09.649066 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:09 crc kubenswrapper[4824]: I0224 00:07:09.649084 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:09 crc kubenswrapper[4824]: I0224 00:07:09.649102 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:09 crc kubenswrapper[4824]: I0224 00:07:09.649114 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:09Z","lastTransitionTime":"2026-02-24T00:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:09 crc kubenswrapper[4824]: I0224 00:07:09.692752 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:07:09 crc kubenswrapper[4824]: E0224 00:07:09.692987 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 00:07:09 crc kubenswrapper[4824]: I0224 00:07:09.695783 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 01:55:50.967137565 +0000 UTC Feb 24 00:07:09 crc kubenswrapper[4824]: I0224 00:07:09.752511 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:09 crc kubenswrapper[4824]: I0224 00:07:09.752582 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:09 crc kubenswrapper[4824]: I0224 00:07:09.752623 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:09 crc kubenswrapper[4824]: I0224 00:07:09.752646 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:09 crc kubenswrapper[4824]: I0224 00:07:09.752663 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:09Z","lastTransitionTime":"2026-02-24T00:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:09 crc kubenswrapper[4824]: I0224 00:07:09.855180 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:09 crc kubenswrapper[4824]: I0224 00:07:09.855272 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:09 crc kubenswrapper[4824]: I0224 00:07:09.855286 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:09 crc kubenswrapper[4824]: I0224 00:07:09.855308 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:09 crc kubenswrapper[4824]: I0224 00:07:09.855323 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:09Z","lastTransitionTime":"2026-02-24T00:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:09 crc kubenswrapper[4824]: I0224 00:07:09.958409 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:09 crc kubenswrapper[4824]: I0224 00:07:09.958498 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:09 crc kubenswrapper[4824]: I0224 00:07:09.958542 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:09 crc kubenswrapper[4824]: I0224 00:07:09.958574 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:09 crc kubenswrapper[4824]: I0224 00:07:09.958592 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:09Z","lastTransitionTime":"2026-02-24T00:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.061440 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.061933 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.062028 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.062130 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.062209 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:10Z","lastTransitionTime":"2026-02-24T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.164859 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.164903 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.164919 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.164939 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.164952 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:10Z","lastTransitionTime":"2026-02-24T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.267423 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.267756 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.267823 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.267950 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.268042 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:10Z","lastTransitionTime":"2026-02-24T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.370824 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.371174 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.371238 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.371309 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.371393 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:10Z","lastTransitionTime":"2026-02-24T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.466619 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.466722 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.466746 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.466779 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.466808 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:10Z","lastTransitionTime":"2026-02-24T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.483317 4824 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 24 00:07:10 crc kubenswrapper[4824]: E0224 00:07:10.486935 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7ea41d01-04ab-44da-af10-993e94777268\\\",\\\"systemUUID\\\":\\\"d5e3d68d-d538-4dbe-b3fe-7347ab36b29a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.493038 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.493169 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.493188 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.493214 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.493231 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:10Z","lastTransitionTime":"2026-02-24T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:10 crc kubenswrapper[4824]: E0224 00:07:10.514580 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7ea41d01-04ab-44da-af10-993e94777268\\\",\\\"systemUUID\\\":\\\"d5e3d68d-d538-4dbe-b3fe-7347ab36b29a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.520590 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.520644 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.520660 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.520680 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.520694 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:10Z","lastTransitionTime":"2026-02-24T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:10 crc kubenswrapper[4824]: E0224 00:07:10.537687 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7ea41d01-04ab-44da-af10-993e94777268\\\",\\\"systemUUID\\\":\\\"d5e3d68d-d538-4dbe-b3fe-7347ab36b29a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.543019 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.543078 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.543096 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.543124 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.543145 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:10Z","lastTransitionTime":"2026-02-24T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:10 crc kubenswrapper[4824]: E0224 00:07:10.561693 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7ea41d01-04ab-44da-af10-993e94777268\\\",\\\"systemUUID\\\":\\\"d5e3d68d-d538-4dbe-b3fe-7347ab36b29a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.567344 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.567404 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.567417 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.567443 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.567459 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:10Z","lastTransitionTime":"2026-02-24T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:10 crc kubenswrapper[4824]: E0224 00:07:10.583094 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7ea41d01-04ab-44da-af10-993e94777268\\\",\\\"systemUUID\\\":\\\"d5e3d68d-d538-4dbe-b3fe-7347ab36b29a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 24 00:07:10 crc kubenswrapper[4824]: E0224 00:07:10.583292 4824 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.585504 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.585603 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.585628 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.585665 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.585687 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:10Z","lastTransitionTime":"2026-02-24T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.689958 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.690026 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.690038 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.690055 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.690070 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:10Z","lastTransitionTime":"2026-02-24T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.693806 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.693920 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 24 00:07:10 crc kubenswrapper[4824]: E0224 00:07:10.693951 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 24 00:07:10 crc kubenswrapper[4824]: E0224 00:07:10.694194 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.696496 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 03:17:48.634389171 +0000 UTC
Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.793254 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.793327 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.793343 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.793367 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.793388 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:10Z","lastTransitionTime":"2026-02-24T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.895252 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.895295 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.895305 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.895321 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.895330 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:10Z","lastTransitionTime":"2026-02-24T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.998025 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.998063 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.998073 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.998090 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 00:07:10 crc kubenswrapper[4824]: I0224 00:07:10.998100 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:10Z","lastTransitionTime":"2026-02-24T00:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 24 00:07:11 crc kubenswrapper[4824]: I0224 00:07:11.100293 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:07:11 crc kubenswrapper[4824]: I0224 00:07:11.100344 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:07:11 crc kubenswrapper[4824]: I0224 00:07:11.100357 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:07:11 crc kubenswrapper[4824]: I0224 00:07:11.100377 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 00:07:11 crc kubenswrapper[4824]: I0224 00:07:11.100394 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:11Z","lastTransitionTime":"2026-02-24T00:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:11 crc kubenswrapper[4824]: I0224 00:07:11.203283 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:11 crc kubenswrapper[4824]: I0224 00:07:11.203369 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:11 crc kubenswrapper[4824]: I0224 00:07:11.203395 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:11 crc kubenswrapper[4824]: I0224 00:07:11.203436 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:11 crc kubenswrapper[4824]: I0224 00:07:11.203462 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:11Z","lastTransitionTime":"2026-02-24T00:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:11 crc kubenswrapper[4824]: I0224 00:07:11.306448 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:11 crc kubenswrapper[4824]: I0224 00:07:11.306499 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:11 crc kubenswrapper[4824]: I0224 00:07:11.306512 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:11 crc kubenswrapper[4824]: I0224 00:07:11.306548 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:11 crc kubenswrapper[4824]: I0224 00:07:11.306563 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:11Z","lastTransitionTime":"2026-02-24T00:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:11 crc kubenswrapper[4824]: I0224 00:07:11.410289 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:11 crc kubenswrapper[4824]: I0224 00:07:11.410360 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:11 crc kubenswrapper[4824]: I0224 00:07:11.410380 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:11 crc kubenswrapper[4824]: I0224 00:07:11.410408 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:11 crc kubenswrapper[4824]: I0224 00:07:11.410430 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:11Z","lastTransitionTime":"2026-02-24T00:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:11 crc kubenswrapper[4824]: I0224 00:07:11.513152 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:11 crc kubenswrapper[4824]: I0224 00:07:11.513206 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:11 crc kubenswrapper[4824]: I0224 00:07:11.513222 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:11 crc kubenswrapper[4824]: I0224 00:07:11.513246 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:11 crc kubenswrapper[4824]: I0224 00:07:11.513260 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:11Z","lastTransitionTime":"2026-02-24T00:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:11 crc kubenswrapper[4824]: I0224 00:07:11.616669 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:11 crc kubenswrapper[4824]: I0224 00:07:11.616746 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:11 crc kubenswrapper[4824]: I0224 00:07:11.616769 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:11 crc kubenswrapper[4824]: I0224 00:07:11.616802 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:11 crc kubenswrapper[4824]: I0224 00:07:11.616825 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:11Z","lastTransitionTime":"2026-02-24T00:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:11 crc kubenswrapper[4824]: I0224 00:07:11.692846 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:07:11 crc kubenswrapper[4824]: E0224 00:07:11.693436 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 00:07:11 crc kubenswrapper[4824]: I0224 00:07:11.696929 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 19:09:05.26078533 +0000 UTC Feb 24 00:07:11 crc kubenswrapper[4824]: I0224 00:07:11.720557 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:11 crc kubenswrapper[4824]: I0224 00:07:11.720604 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:11 crc kubenswrapper[4824]: I0224 00:07:11.720621 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:11 crc kubenswrapper[4824]: I0224 00:07:11.720647 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:11 crc kubenswrapper[4824]: I0224 00:07:11.720664 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:11Z","lastTransitionTime":"2026-02-24T00:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:11 crc kubenswrapper[4824]: I0224 00:07:11.823236 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:11 crc kubenswrapper[4824]: I0224 00:07:11.823299 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:11 crc kubenswrapper[4824]: I0224 00:07:11.823435 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:11 crc kubenswrapper[4824]: I0224 00:07:11.823467 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:11 crc kubenswrapper[4824]: I0224 00:07:11.823488 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:11Z","lastTransitionTime":"2026-02-24T00:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:11 crc kubenswrapper[4824]: I0224 00:07:11.926988 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:11 crc kubenswrapper[4824]: I0224 00:07:11.927047 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:11 crc kubenswrapper[4824]: I0224 00:07:11.927057 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:11 crc kubenswrapper[4824]: I0224 00:07:11.927075 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:11 crc kubenswrapper[4824]: I0224 00:07:11.927086 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:11Z","lastTransitionTime":"2026-02-24T00:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:12 crc kubenswrapper[4824]: I0224 00:07:12.029358 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:12 crc kubenswrapper[4824]: I0224 00:07:12.029433 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:12 crc kubenswrapper[4824]: I0224 00:07:12.029459 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:12 crc kubenswrapper[4824]: I0224 00:07:12.029491 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:12 crc kubenswrapper[4824]: I0224 00:07:12.029569 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:12Z","lastTransitionTime":"2026-02-24T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:12 crc kubenswrapper[4824]: I0224 00:07:12.131492 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:12 crc kubenswrapper[4824]: I0224 00:07:12.131990 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:12 crc kubenswrapper[4824]: I0224 00:07:12.132154 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:12 crc kubenswrapper[4824]: I0224 00:07:12.132345 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:12 crc kubenswrapper[4824]: I0224 00:07:12.132523 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:12Z","lastTransitionTime":"2026-02-24T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:12 crc kubenswrapper[4824]: I0224 00:07:12.236290 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:12 crc kubenswrapper[4824]: I0224 00:07:12.236856 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:12 crc kubenswrapper[4824]: I0224 00:07:12.237078 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:12 crc kubenswrapper[4824]: I0224 00:07:12.237309 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:12 crc kubenswrapper[4824]: I0224 00:07:12.237741 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:12Z","lastTransitionTime":"2026-02-24T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:12 crc kubenswrapper[4824]: I0224 00:07:12.341131 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:12 crc kubenswrapper[4824]: I0224 00:07:12.341187 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:12 crc kubenswrapper[4824]: I0224 00:07:12.341199 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:12 crc kubenswrapper[4824]: I0224 00:07:12.341220 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:12 crc kubenswrapper[4824]: I0224 00:07:12.341234 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:12Z","lastTransitionTime":"2026-02-24T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:12 crc kubenswrapper[4824]: I0224 00:07:12.443900 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:12 crc kubenswrapper[4824]: I0224 00:07:12.444391 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:12 crc kubenswrapper[4824]: I0224 00:07:12.444572 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:12 crc kubenswrapper[4824]: I0224 00:07:12.444734 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:12 crc kubenswrapper[4824]: I0224 00:07:12.444877 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:12Z","lastTransitionTime":"2026-02-24T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:12 crc kubenswrapper[4824]: I0224 00:07:12.547691 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:12 crc kubenswrapper[4824]: I0224 00:07:12.547762 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:12 crc kubenswrapper[4824]: I0224 00:07:12.547781 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:12 crc kubenswrapper[4824]: I0224 00:07:12.547809 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:12 crc kubenswrapper[4824]: I0224 00:07:12.547830 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:12Z","lastTransitionTime":"2026-02-24T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:12 crc kubenswrapper[4824]: I0224 00:07:12.651215 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:12 crc kubenswrapper[4824]: I0224 00:07:12.651287 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:12 crc kubenswrapper[4824]: I0224 00:07:12.651312 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:12 crc kubenswrapper[4824]: I0224 00:07:12.651344 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:12 crc kubenswrapper[4824]: I0224 00:07:12.651372 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:12Z","lastTransitionTime":"2026-02-24T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:12 crc kubenswrapper[4824]: I0224 00:07:12.693510 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:07:12 crc kubenswrapper[4824]: I0224 00:07:12.693605 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:07:12 crc kubenswrapper[4824]: E0224 00:07:12.693790 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 00:07:12 crc kubenswrapper[4824]: E0224 00:07:12.694243 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 00:07:12 crc kubenswrapper[4824]: I0224 00:07:12.699025 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 16:24:54.695768496 +0000 UTC Feb 24 00:07:12 crc kubenswrapper[4824]: I0224 00:07:12.753858 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:12 crc kubenswrapper[4824]: I0224 00:07:12.753917 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:12 crc kubenswrapper[4824]: I0224 00:07:12.753931 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:12 crc kubenswrapper[4824]: I0224 00:07:12.753952 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:12 crc kubenswrapper[4824]: I0224 00:07:12.753971 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:12Z","lastTransitionTime":"2026-02-24T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:12 crc kubenswrapper[4824]: I0224 00:07:12.858002 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:12 crc kubenswrapper[4824]: I0224 00:07:12.858094 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:12 crc kubenswrapper[4824]: I0224 00:07:12.858109 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:12 crc kubenswrapper[4824]: I0224 00:07:12.858132 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:12 crc kubenswrapper[4824]: I0224 00:07:12.858148 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:12Z","lastTransitionTime":"2026-02-24T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:12 crc kubenswrapper[4824]: I0224 00:07:12.962094 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:12 crc kubenswrapper[4824]: I0224 00:07:12.962149 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:12 crc kubenswrapper[4824]: I0224 00:07:12.962163 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:12 crc kubenswrapper[4824]: I0224 00:07:12.962185 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:12 crc kubenswrapper[4824]: I0224 00:07:12.962197 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:12Z","lastTransitionTime":"2026-02-24T00:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:13 crc kubenswrapper[4824]: I0224 00:07:13.065455 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:13 crc kubenswrapper[4824]: I0224 00:07:13.065567 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:13 crc kubenswrapper[4824]: I0224 00:07:13.065588 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:13 crc kubenswrapper[4824]: I0224 00:07:13.065617 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:13 crc kubenswrapper[4824]: I0224 00:07:13.065641 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:13Z","lastTransitionTime":"2026-02-24T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:13 crc kubenswrapper[4824]: I0224 00:07:13.168050 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:13 crc kubenswrapper[4824]: I0224 00:07:13.168114 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:13 crc kubenswrapper[4824]: I0224 00:07:13.168133 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:13 crc kubenswrapper[4824]: I0224 00:07:13.168162 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:13 crc kubenswrapper[4824]: I0224 00:07:13.168182 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:13Z","lastTransitionTime":"2026-02-24T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:13 crc kubenswrapper[4824]: I0224 00:07:13.271695 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:13 crc kubenswrapper[4824]: I0224 00:07:13.271811 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:13 crc kubenswrapper[4824]: I0224 00:07:13.271841 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:13 crc kubenswrapper[4824]: I0224 00:07:13.271877 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:13 crc kubenswrapper[4824]: I0224 00:07:13.271901 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:13Z","lastTransitionTime":"2026-02-24T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:13 crc kubenswrapper[4824]: I0224 00:07:13.375054 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:13 crc kubenswrapper[4824]: I0224 00:07:13.375150 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:13 crc kubenswrapper[4824]: I0224 00:07:13.375186 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:13 crc kubenswrapper[4824]: I0224 00:07:13.375221 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:13 crc kubenswrapper[4824]: I0224 00:07:13.375249 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:13Z","lastTransitionTime":"2026-02-24T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:13 crc kubenswrapper[4824]: I0224 00:07:13.478703 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:13 crc kubenswrapper[4824]: I0224 00:07:13.478801 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:13 crc kubenswrapper[4824]: I0224 00:07:13.478826 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:13 crc kubenswrapper[4824]: I0224 00:07:13.478852 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:13 crc kubenswrapper[4824]: I0224 00:07:13.478872 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:13Z","lastTransitionTime":"2026-02-24T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 24 00:07:13 crc kubenswrapper[4824]: I0224 00:07:13.581170 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:07:13 crc kubenswrapper[4824]: I0224 00:07:13.581254 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:07:13 crc kubenswrapper[4824]: I0224 00:07:13.581280 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:07:13 crc kubenswrapper[4824]: I0224 00:07:13.581316 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 00:07:13 crc kubenswrapper[4824]: I0224 00:07:13.581343 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:13Z","lastTransitionTime":"2026-02-24T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 00:07:13 crc kubenswrapper[4824]: I0224 00:07:13.685235 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:07:13 crc kubenswrapper[4824]: I0224 00:07:13.685322 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:07:13 crc kubenswrapper[4824]: I0224 00:07:13.685349 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:07:13 crc kubenswrapper[4824]: I0224 00:07:13.685382 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 00:07:13 crc kubenswrapper[4824]: I0224 00:07:13.685410 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:13Z","lastTransitionTime":"2026-02-24T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 00:07:13 crc kubenswrapper[4824]: I0224 00:07:13.692849 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 24 00:07:13 crc kubenswrapper[4824]: E0224 00:07:13.693288 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 24 00:07:13 crc kubenswrapper[4824]: I0224 00:07:13.699639 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 04:49:03.415965624 +0000 UTC
Feb 24 00:07:13 crc kubenswrapper[4824]: I0224 00:07:13.787933 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:07:13 crc kubenswrapper[4824]: I0224 00:07:13.787994 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:07:13 crc kubenswrapper[4824]: I0224 00:07:13.788013 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:07:13 crc kubenswrapper[4824]: I0224 00:07:13.788042 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 00:07:13 crc kubenswrapper[4824]: I0224 00:07:13.788062 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:13Z","lastTransitionTime":"2026-02-24T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 00:07:13 crc kubenswrapper[4824]: I0224 00:07:13.891480 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:07:13 crc kubenswrapper[4824]: I0224 00:07:13.891565 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:07:13 crc kubenswrapper[4824]: I0224 00:07:13.891579 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:07:13 crc kubenswrapper[4824]: I0224 00:07:13.891715 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 00:07:13 crc kubenswrapper[4824]: I0224 00:07:13.891729 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:13Z","lastTransitionTime":"2026-02-24T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 00:07:13 crc kubenswrapper[4824]: I0224 00:07:13.994542 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:07:13 crc kubenswrapper[4824]: I0224 00:07:13.994576 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:07:13 crc kubenswrapper[4824]: I0224 00:07:13.994585 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:07:13 crc kubenswrapper[4824]: I0224 00:07:13.994601 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 00:07:13 crc kubenswrapper[4824]: I0224 00:07:13.994614 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:13Z","lastTransitionTime":"2026-02-24T00:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 00:07:14 crc kubenswrapper[4824]: I0224 00:07:14.097015 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:07:14 crc kubenswrapper[4824]: I0224 00:07:14.097053 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:07:14 crc kubenswrapper[4824]: I0224 00:07:14.097063 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:07:14 crc kubenswrapper[4824]: I0224 00:07:14.097079 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 00:07:14 crc kubenswrapper[4824]: I0224 00:07:14.097090 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:14Z","lastTransitionTime":"2026-02-24T00:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:14 crc kubenswrapper[4824]: I0224 00:07:14.106559 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"684e1bbbe61030fdd75db917dbf9ac4fedf392502a653c8f4600dc829410aa95"} Feb 24 00:07:14 crc kubenswrapper[4824]: I0224 00:07:14.106645 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"983e13c7b9d6d0b8183250aedfe93ec3ac42de4dae305c03caa9f653082bb329"} Feb 24 00:07:14 crc kubenswrapper[4824]: I0224 00:07:14.125435 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 00:07:14 crc kubenswrapper[4824]: I0224 00:07:14.140636 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684e1bbbe61030fdd75db917dbf9ac4fedf392502a653c8f4600dc829410aa95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://983e13c7b9d6d0b8183250aedfe93ec3ac42de4dae305c03caa9f653082bb329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 00:07:14 crc kubenswrapper[4824]: I0224 00:07:14.152565 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 00:07:14 crc kubenswrapper[4824]: I0224 00:07:14.166742 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5209eb40-b077-42f2-9239-27bf1cde0e05\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c8ceeca4d71401bcd6ae7d050ac58f840a0cc91f699f91f799930c896be922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd9b54ae9c9e895db2191be45a25d352d8c0617350519ad21f6ed3f2af941fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf3a76ba41dba79001ebad9083600800b0f094accb5dc60c2a364aa001909feb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T00:06:36Z\\\",\\\"message\\\":\\\"W0224 00:06:35.979734 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0224 00:06:35.980117 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771891595 cert, and key in /tmp/serving-cert-1325703996/serving-signer.crt, /tmp/serving-cert-1325703996/serving-signer.key\\\\nI0224 00:06:36.313594 1 observer_polling.go:159] Starting file observer\\\\nW0224 00:06:36.323413 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0224 00:06:36.325040 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 00:06:36.326853 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1325703996/tls.crt::/tmp/serving-cert-1325703996/tls.key\\\\\\\"\\\\nF0224 00:06:36.787238 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:06:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81bb21556a01a403315d2c9623aa07ab39ac4a789433423a89eda1da3547b6d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 00:07:14 crc kubenswrapper[4824]: I0224 00:07:14.178492 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 00:07:14 crc kubenswrapper[4824]: I0224 00:07:14.191660 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 00:07:14 crc kubenswrapper[4824]: I0224 00:07:14.199872 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:14 crc kubenswrapper[4824]: I0224 00:07:14.199921 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 24 00:07:14 crc kubenswrapper[4824]: I0224 00:07:14.199940 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:14 crc kubenswrapper[4824]: I0224 00:07:14.199985 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:14 crc kubenswrapper[4824]: I0224 00:07:14.200004 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:14Z","lastTransitionTime":"2026-02-24T00:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:14 crc kubenswrapper[4824]: I0224 00:07:14.206297 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 00:07:14 crc kubenswrapper[4824]: I0224 00:07:14.303008 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:14 crc kubenswrapper[4824]: I0224 00:07:14.303066 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:14 crc kubenswrapper[4824]: I0224 00:07:14.303077 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:14 crc kubenswrapper[4824]: I0224 00:07:14.303097 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady"
Feb 24 00:07:14 crc kubenswrapper[4824]: I0224 00:07:14.303112 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:14Z","lastTransitionTime":"2026-02-24T00:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 00:07:14 crc kubenswrapper[4824]: I0224 00:07:14.405624 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:07:14 crc kubenswrapper[4824]: I0224 00:07:14.405721 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:07:14 crc kubenswrapper[4824]: I0224 00:07:14.405733 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:07:14 crc kubenswrapper[4824]: I0224 00:07:14.405751 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 00:07:14 crc kubenswrapper[4824]: I0224 00:07:14.405762 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:14Z","lastTransitionTime":"2026-02-24T00:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 00:07:14 crc kubenswrapper[4824]: I0224 00:07:14.509356 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:07:14 crc kubenswrapper[4824]: I0224 00:07:14.509394 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:07:14 crc kubenswrapper[4824]: I0224 00:07:14.509404 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:07:14 crc kubenswrapper[4824]: I0224 00:07:14.509421 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 00:07:14 crc kubenswrapper[4824]: I0224 00:07:14.509431 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:14Z","lastTransitionTime":"2026-02-24T00:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 00:07:14 crc kubenswrapper[4824]: I0224 00:07:14.613364 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:07:14 crc kubenswrapper[4824]: I0224 00:07:14.613444 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:07:14 crc kubenswrapper[4824]: I0224 00:07:14.613482 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:07:14 crc kubenswrapper[4824]: I0224 00:07:14.613514 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 00:07:14 crc kubenswrapper[4824]: I0224 00:07:14.613585 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:14Z","lastTransitionTime":"2026-02-24T00:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 00:07:14 crc kubenswrapper[4824]: I0224 00:07:14.693288 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 24 00:07:14 crc kubenswrapper[4824]: I0224 00:07:14.693340 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 24 00:07:14 crc kubenswrapper[4824]: E0224 00:07:14.693426 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 00:07:14 crc kubenswrapper[4824]: E0224 00:07:14.693688 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 00:07:14 crc kubenswrapper[4824]: I0224 00:07:14.700027 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 10:23:53.533786416 +0000 UTC Feb 24 00:07:14 crc kubenswrapper[4824]: I0224 00:07:14.720457 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:14 crc kubenswrapper[4824]: I0224 00:07:14.720559 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:14 crc kubenswrapper[4824]: I0224 00:07:14.720583 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:14 crc kubenswrapper[4824]: I0224 00:07:14.720613 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:14 crc kubenswrapper[4824]: I0224 00:07:14.720639 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:14Z","lastTransitionTime":"2026-02-24T00:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:14 crc kubenswrapper[4824]: I0224 00:07:14.823769 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:14 crc kubenswrapper[4824]: I0224 00:07:14.823808 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:14 crc kubenswrapper[4824]: I0224 00:07:14.823817 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:14 crc kubenswrapper[4824]: I0224 00:07:14.823836 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:14 crc kubenswrapper[4824]: I0224 00:07:14.823848 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:14Z","lastTransitionTime":"2026-02-24T00:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:14 crc kubenswrapper[4824]: I0224 00:07:14.927449 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:14 crc kubenswrapper[4824]: I0224 00:07:14.927975 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:14 crc kubenswrapper[4824]: I0224 00:07:14.928153 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:14 crc kubenswrapper[4824]: I0224 00:07:14.928331 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:14 crc kubenswrapper[4824]: I0224 00:07:14.928495 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:14Z","lastTransitionTime":"2026-02-24T00:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:15 crc kubenswrapper[4824]: I0224 00:07:15.032455 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:15 crc kubenswrapper[4824]: I0224 00:07:15.033111 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:15 crc kubenswrapper[4824]: I0224 00:07:15.033349 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:15 crc kubenswrapper[4824]: I0224 00:07:15.033601 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:15 crc kubenswrapper[4824]: I0224 00:07:15.033824 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:15Z","lastTransitionTime":"2026-02-24T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:15 crc kubenswrapper[4824]: I0224 00:07:15.137115 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:15 crc kubenswrapper[4824]: I0224 00:07:15.137181 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:15 crc kubenswrapper[4824]: I0224 00:07:15.137200 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:15 crc kubenswrapper[4824]: I0224 00:07:15.137226 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:15 crc kubenswrapper[4824]: I0224 00:07:15.137246 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:15Z","lastTransitionTime":"2026-02-24T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:15 crc kubenswrapper[4824]: I0224 00:07:15.240686 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:15 crc kubenswrapper[4824]: I0224 00:07:15.240765 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:15 crc kubenswrapper[4824]: I0224 00:07:15.240791 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:15 crc kubenswrapper[4824]: I0224 00:07:15.240823 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:15 crc kubenswrapper[4824]: I0224 00:07:15.240848 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:15Z","lastTransitionTime":"2026-02-24T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:15 crc kubenswrapper[4824]: I0224 00:07:15.344450 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:15 crc kubenswrapper[4824]: I0224 00:07:15.344560 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:15 crc kubenswrapper[4824]: I0224 00:07:15.344578 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:15 crc kubenswrapper[4824]: I0224 00:07:15.344613 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:15 crc kubenswrapper[4824]: I0224 00:07:15.344637 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:15Z","lastTransitionTime":"2026-02-24T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:15 crc kubenswrapper[4824]: I0224 00:07:15.447986 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:15 crc kubenswrapper[4824]: I0224 00:07:15.448044 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:15 crc kubenswrapper[4824]: I0224 00:07:15.448057 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:15 crc kubenswrapper[4824]: I0224 00:07:15.448078 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:15 crc kubenswrapper[4824]: I0224 00:07:15.448095 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:15Z","lastTransitionTime":"2026-02-24T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:15 crc kubenswrapper[4824]: I0224 00:07:15.551373 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:15 crc kubenswrapper[4824]: I0224 00:07:15.551456 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:15 crc kubenswrapper[4824]: I0224 00:07:15.551479 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:15 crc kubenswrapper[4824]: I0224 00:07:15.551507 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:15 crc kubenswrapper[4824]: I0224 00:07:15.551573 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:15Z","lastTransitionTime":"2026-02-24T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:15 crc kubenswrapper[4824]: I0224 00:07:15.654681 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:15 crc kubenswrapper[4824]: I0224 00:07:15.654730 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:15 crc kubenswrapper[4824]: I0224 00:07:15.654744 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:15 crc kubenswrapper[4824]: I0224 00:07:15.654763 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:15 crc kubenswrapper[4824]: I0224 00:07:15.654778 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:15Z","lastTransitionTime":"2026-02-24T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:15 crc kubenswrapper[4824]: I0224 00:07:15.693390 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:07:15 crc kubenswrapper[4824]: E0224 00:07:15.693554 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 00:07:15 crc kubenswrapper[4824]: I0224 00:07:15.700251 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 18:22:55.159844998 +0000 UTC Feb 24 00:07:15 crc kubenswrapper[4824]: I0224 00:07:15.758616 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:15 crc kubenswrapper[4824]: I0224 00:07:15.758662 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:15 crc kubenswrapper[4824]: I0224 00:07:15.758673 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:15 crc kubenswrapper[4824]: I0224 00:07:15.758694 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:15 crc kubenswrapper[4824]: I0224 00:07:15.758706 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:15Z","lastTransitionTime":"2026-02-24T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:15 crc kubenswrapper[4824]: I0224 00:07:15.866044 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:15 crc kubenswrapper[4824]: I0224 00:07:15.866206 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:15 crc kubenswrapper[4824]: I0224 00:07:15.866220 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:15 crc kubenswrapper[4824]: I0224 00:07:15.866242 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:15 crc kubenswrapper[4824]: I0224 00:07:15.866265 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:15Z","lastTransitionTime":"2026-02-24T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:15 crc kubenswrapper[4824]: I0224 00:07:15.969544 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:15 crc kubenswrapper[4824]: I0224 00:07:15.969597 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:15 crc kubenswrapper[4824]: I0224 00:07:15.969607 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:15 crc kubenswrapper[4824]: I0224 00:07:15.969629 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:15 crc kubenswrapper[4824]: I0224 00:07:15.969642 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:15Z","lastTransitionTime":"2026-02-24T00:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.073084 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.073621 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.073635 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.073659 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.073677 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:16Z","lastTransitionTime":"2026-02-24T00:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.176357 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.176434 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.176462 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.176495 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.176548 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:16Z","lastTransitionTime":"2026-02-24T00:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.280850 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.280909 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.280922 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.280944 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.280961 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:16Z","lastTransitionTime":"2026-02-24T00:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.384765 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.384813 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.384825 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.384843 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.384857 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:16Z","lastTransitionTime":"2026-02-24T00:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.488714 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.488791 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.488815 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.488845 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.488868 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:16Z","lastTransitionTime":"2026-02-24T00:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.491425 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.491612 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:07:16 crc kubenswrapper[4824]: E0224 00:07:16.491685 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:07:32.49165666 +0000 UTC m=+116.481281159 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.491735 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:07:16 crc kubenswrapper[4824]: E0224 00:07:16.491875 4824 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 00:07:16 crc kubenswrapper[4824]: E0224 00:07:16.491883 4824 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 00:07:16 crc kubenswrapper[4824]: E0224 00:07:16.491970 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 00:07:32.491946648 +0000 UTC m=+116.481571147 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 00:07:16 crc kubenswrapper[4824]: E0224 00:07:16.492009 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 00:07:32.491991649 +0000 UTC m=+116.481616158 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.591913 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.591971 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.591992 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.592022 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.592044 4824 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:16Z","lastTransitionTime":"2026-02-24T00:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.592258 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.592320 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:07:16 crc kubenswrapper[4824]: E0224 00:07:16.592600 4824 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 24 00:07:16 crc kubenswrapper[4824]: E0224 00:07:16.592643 4824 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 24 00:07:16 crc kubenswrapper[4824]: E0224 00:07:16.592669 4824 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 00:07:16 crc kubenswrapper[4824]: E0224 00:07:16.592771 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-24 00:07:32.59273762 +0000 UTC m=+116.582362129 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 00:07:16 crc kubenswrapper[4824]: E0224 00:07:16.592811 4824 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 24 00:07:16 crc kubenswrapper[4824]: E0224 00:07:16.592850 4824 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 24 00:07:16 crc kubenswrapper[4824]: E0224 00:07:16.592874 4824 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 00:07:16 crc kubenswrapper[4824]: E0224 00:07:16.592950 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 
nodeName:}" failed. No retries permitted until 2026-02-24 00:07:32.592926585 +0000 UTC m=+116.582551084 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.693412 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.693396 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:07:16 crc kubenswrapper[4824]: E0224 00:07:16.694014 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 00:07:16 crc kubenswrapper[4824]: E0224 00:07:16.694132 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.695204 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.695270 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.695290 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.695348 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.695362 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:16Z","lastTransitionTime":"2026-02-24T00:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.700901 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 03:46:23.399819054 +0000 UTC Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.712868 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5209eb40-b077-42f2-9239-27bf1cde0e05\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c8ceeca4d71401bcd6ae7d050ac58f840a0cc91f699f91f799930c896be922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd9b54ae9c9e895db2191be45a25d352d8c0617350519ad21f6ed3f2af941fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://bf3a76ba41dba79001ebad9083600800b0f094accb5dc60c2a364aa001909feb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T00:06:36Z\\\",\\\"message\\\":\\\"W0224 00:06:35.979734 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0224 00:06:35.980117 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771891595 cert, and key in /tmp/serving-cert-1325703996/serving-signer.crt, /tmp/serving-cert-1325703996/serving-signer.key\\\\nI0224 00:06:36.313594 1 observer_polling.go:159] Starting file observer\\\\nW0224 00:06:36.323413 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0224 00:06:36.325040 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 00:06:36.326853 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1325703996/tls.crt::/tmp/serving-cert-1325703996/tls.key\\\\\\\"\\\\nF0224 00:06:36.787238 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:06:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81bb21556a01a403315d2c9623aa07ab39ac4a789433423a89eda1da3547b6d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218
a0302a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:16Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.726033 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:16Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.741858 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684e1bbbe61030fdd75db917dbf9ac4fedf392502a653c8f4600dc829410aa95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://983e13c7b9d6d0b8183250aedfe93ec3ac42de4dae305c03caa9f653082bb329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:16Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.758870 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:16Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.774543 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:16Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.785347 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:16Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.798680 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.798728 4824 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.798740 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.798760 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.798773 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:16Z","lastTransitionTime":"2026-02-24T00:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.799680 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:16Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.901820 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.901892 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.901904 4824 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.901922 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:16 crc kubenswrapper[4824]: I0224 00:07:16.901933 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:16Z","lastTransitionTime":"2026-02-24T00:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.005173 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.005238 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.005256 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.005281 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.005300 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:17Z","lastTransitionTime":"2026-02-24T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.109560 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.109621 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.109636 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.109658 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.109674 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:17Z","lastTransitionTime":"2026-02-24T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.120151 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"dc76b60d9c2e639d3340a172e70670a434008774f3ec5751dae2ffc78a913486"} Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.134726 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:17Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.149685 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:17Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.160733 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:17Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.178833 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5209eb40-b077-42f2-9239-27bf1cde0e05\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c8ceeca4d71401bcd6ae7d050ac58f840a0cc91f699f91f799930c896be922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd9b54ae9c9e895db2191be45a25d352d8c0617350519ad21f6ed3f2af941fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf3a76ba41dba79001ebad9083600800b0f094accb5dc60c2a364aa001909feb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T00:06:36Z\\\",\\\"message\\\":\\\"W0224 00:06:35.979734 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0224 00:06:35.980117 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771891595 cert, and key in /tmp/serving-cert-1325703996/serving-signer.crt, 
/tmp/serving-cert-1325703996/serving-signer.key\\\\nI0224 00:06:36.313594 1 observer_polling.go:159] Starting file observer\\\\nW0224 00:06:36.323413 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0224 00:06:36.325040 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 00:06:36.326853 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1325703996/tls.crt::/tmp/serving-cert-1325703996/tls.key\\\\\\\"\\\\nF0224 00:06:36.787238 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:06:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81bb21556a01a403315d2c9623aa07ab39ac4a789433423a89eda1da3547b6d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:17Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.193492 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc76b60d9c2e639d3340a172e70670a434008774f3ec5751dae2ffc78a913486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\
"2026-02-24T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:17Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.208701 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684e1bbbe61030fdd75db917dbf9ac4fedf392502a653c8f4600dc829410aa95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://983e13c7b9d6d0b8183250aedfe93ec3ac42de4dae305c03caa9f653082bb329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:17Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.213824 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.213873 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.213906 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.213924 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.213935 4824 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:17Z","lastTransitionTime":"2026-02-24T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.229627 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:17Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.317317 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.317384 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.317402 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.317425 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.317441 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:17Z","lastTransitionTime":"2026-02-24T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.420481 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.420583 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.420602 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.420629 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.420649 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:17Z","lastTransitionTime":"2026-02-24T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.524125 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.524173 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.524186 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.524206 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.524224 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:17Z","lastTransitionTime":"2026-02-24T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.627346 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.627382 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.627391 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.627407 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.627417 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:17Z","lastTransitionTime":"2026-02-24T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.693088 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 24 00:07:17 crc kubenswrapper[4824]: E0224 00:07:17.693300 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.701411 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 10:15:04.877651357 +0000 UTC
Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.730796 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.730842 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.730854 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.730873 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.730886 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:17Z","lastTransitionTime":"2026-02-24T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.833611 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.833662 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.833673 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.833690 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.833701 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:17Z","lastTransitionTime":"2026-02-24T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.936901 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.936977 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.937012 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.937042 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 00:07:17 crc kubenswrapper[4824]: I0224 00:07:17.937054 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:17Z","lastTransitionTime":"2026-02-24T00:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.039611 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.039667 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.039678 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.039706 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.039725 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:18Z","lastTransitionTime":"2026-02-24T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.125155 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"9b147bde27310be40a9d49e2254ff858c03a32602bad1de273272b7b3292f057"}
Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.142677 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.142752 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.142772 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.142802 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.142823 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:18Z","lastTransitionTime":"2026-02-24T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.143267 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5209eb40-b077-42f2-9239-27bf1cde0e05\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c8ceeca4d71401bcd6ae7d050ac58f840a0cc91f699f91f799930c896be922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd9b54ae9c9e895db2191be45a25d352d8c0617350519ad21f6ed3f2af941fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf3a76ba41dba79001ebad9083600800b0f094accb5dc60c2a364aa001909feb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T00:06:36Z\\\",\\\"message\\\":\\\"W0224 00:06:35.979734 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0224 00:06:35.980117 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771891595 cert, and key in /tmp/serving-cert-1325703996/serving-signer.crt, /tmp/serving-cert-1325703996/serving-signer.key\\\\nI0224 00:06:36.313594 1 observer_polling.go:159] Starting file observer\\\\nW0224 00:06:36.323413 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0224 00:06:36.325040 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 00:06:36.326853 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1325703996/tls.crt::/tmp/serving-cert-1325703996/tls.key\\\\\\\"\\\\nF0224 00:06:36.787238 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:06:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81bb21556a01a403315d2c9623aa07ab39ac4a789433423a89eda1da3547b6d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:18Z is after 2025-08-24T17:21:41Z"
Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.160158 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc76b60d9c2e639d3340a172e70670a434008774f3ec5751dae2ffc78a913486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:18Z is after 2025-08-24T17:21:41Z"
Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.174723 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684e1bbbe61030fdd75db917dbf9ac4fedf392502a653c8f4600dc829410aa95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://983e13c7b9d6d0b8183250aedfe93ec3ac42de4dae305c03caa9f653082bb329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:18Z is after 2025-08-24T17:21:41Z"
Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.190399 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:18Z is after 2025-08-24T17:21:41Z"
Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.208997 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:18Z is after 2025-08-24T17:21:41Z"
Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.229013 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b147bde27310be40a9d49e2254ff858c03a32602bad1de273272b7b3292f057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:18Z is after 2025-08-24T17:21:41Z"
Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.246874 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.246957 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.246976 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.247035 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.247055 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:18Z","lastTransitionTime":"2026-02-24T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.250151 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:18Z is after 2025-08-24T17:21:41Z"
Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.350663 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.350718 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.350728 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.350746 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.350758 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:18Z","lastTransitionTime":"2026-02-24T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.454635 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.454713 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.454736 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.454768 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.454793 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:18Z","lastTransitionTime":"2026-02-24T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.557372 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.557443 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.557467 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.557497 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.557548 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:18Z","lastTransitionTime":"2026-02-24T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.660044 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.660104 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.660127 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.660154 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.660171 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:18Z","lastTransitionTime":"2026-02-24T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.693756 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.693821 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:07:18 crc kubenswrapper[4824]: E0224 00:07:18.693975 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 00:07:18 crc kubenswrapper[4824]: E0224 00:07:18.694103 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.702267 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 07:15:25.610795511 +0000 UTC Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.763575 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.763630 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.763647 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.763669 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.763685 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:18Z","lastTransitionTime":"2026-02-24T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.866943 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.867399 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.867583 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.867762 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.867916 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:18Z","lastTransitionTime":"2026-02-24T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.971215 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.971261 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.971272 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.971291 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:18 crc kubenswrapper[4824]: I0224 00:07:18.971304 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:18Z","lastTransitionTime":"2026-02-24T00:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:19 crc kubenswrapper[4824]: I0224 00:07:19.074727 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:19 crc kubenswrapper[4824]: I0224 00:07:19.074814 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:19 crc kubenswrapper[4824]: I0224 00:07:19.074842 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:19 crc kubenswrapper[4824]: I0224 00:07:19.074877 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:19 crc kubenswrapper[4824]: I0224 00:07:19.074897 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:19Z","lastTransitionTime":"2026-02-24T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:19 crc kubenswrapper[4824]: I0224 00:07:19.178636 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:19 crc kubenswrapper[4824]: I0224 00:07:19.179032 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:19 crc kubenswrapper[4824]: I0224 00:07:19.179253 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:19 crc kubenswrapper[4824]: I0224 00:07:19.179515 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:19 crc kubenswrapper[4824]: I0224 00:07:19.179745 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:19Z","lastTransitionTime":"2026-02-24T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:19 crc kubenswrapper[4824]: I0224 00:07:19.283031 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:19 crc kubenswrapper[4824]: I0224 00:07:19.283835 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:19 crc kubenswrapper[4824]: I0224 00:07:19.283868 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:19 crc kubenswrapper[4824]: I0224 00:07:19.283901 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:19 crc kubenswrapper[4824]: I0224 00:07:19.283917 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:19Z","lastTransitionTime":"2026-02-24T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:19 crc kubenswrapper[4824]: I0224 00:07:19.387252 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:19 crc kubenswrapper[4824]: I0224 00:07:19.387318 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:19 crc kubenswrapper[4824]: I0224 00:07:19.387335 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:19 crc kubenswrapper[4824]: I0224 00:07:19.387398 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:19 crc kubenswrapper[4824]: I0224 00:07:19.387417 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:19Z","lastTransitionTime":"2026-02-24T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:19 crc kubenswrapper[4824]: I0224 00:07:19.491653 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:19 crc kubenswrapper[4824]: I0224 00:07:19.491733 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:19 crc kubenswrapper[4824]: I0224 00:07:19.491778 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:19 crc kubenswrapper[4824]: I0224 00:07:19.491823 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:19 crc kubenswrapper[4824]: I0224 00:07:19.491848 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:19Z","lastTransitionTime":"2026-02-24T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:19 crc kubenswrapper[4824]: I0224 00:07:19.595351 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:19 crc kubenswrapper[4824]: I0224 00:07:19.595399 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:19 crc kubenswrapper[4824]: I0224 00:07:19.595411 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:19 crc kubenswrapper[4824]: I0224 00:07:19.595440 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:19 crc kubenswrapper[4824]: I0224 00:07:19.595452 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:19Z","lastTransitionTime":"2026-02-24T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:19 crc kubenswrapper[4824]: I0224 00:07:19.693717 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:07:19 crc kubenswrapper[4824]: E0224 00:07:19.693874 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 00:07:19 crc kubenswrapper[4824]: I0224 00:07:19.698129 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:19 crc kubenswrapper[4824]: I0224 00:07:19.698191 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:19 crc kubenswrapper[4824]: I0224 00:07:19.698206 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:19 crc kubenswrapper[4824]: I0224 00:07:19.698229 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:19 crc kubenswrapper[4824]: I0224 00:07:19.698246 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:19Z","lastTransitionTime":"2026-02-24T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:19 crc kubenswrapper[4824]: I0224 00:07:19.702749 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 00:56:45.697461011 +0000 UTC Feb 24 00:07:19 crc kubenswrapper[4824]: I0224 00:07:19.800390 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:19 crc kubenswrapper[4824]: I0224 00:07:19.800485 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:19 crc kubenswrapper[4824]: I0224 00:07:19.800548 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:19 crc kubenswrapper[4824]: I0224 00:07:19.800591 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:19 crc kubenswrapper[4824]: I0224 00:07:19.800612 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:19Z","lastTransitionTime":"2026-02-24T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:19 crc kubenswrapper[4824]: I0224 00:07:19.903969 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:19 crc kubenswrapper[4824]: I0224 00:07:19.904042 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:19 crc kubenswrapper[4824]: I0224 00:07:19.904062 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:19 crc kubenswrapper[4824]: I0224 00:07:19.904089 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:19 crc kubenswrapper[4824]: I0224 00:07:19.904109 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:19Z","lastTransitionTime":"2026-02-24T00:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.007767 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.007838 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.007854 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.007876 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.007895 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:20Z","lastTransitionTime":"2026-02-24T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.111556 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.111644 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.111663 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.111694 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.111714 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:20Z","lastTransitionTime":"2026-02-24T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.215619 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.215711 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.215761 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.215785 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.215835 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:20Z","lastTransitionTime":"2026-02-24T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.319510 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.319573 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.319583 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.319603 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.319616 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:20Z","lastTransitionTime":"2026-02-24T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.423666 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.423747 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.423769 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.423795 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.423818 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:20Z","lastTransitionTime":"2026-02-24T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.527940 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.527998 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.528014 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.528055 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.528075 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:20Z","lastTransitionTime":"2026-02-24T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.631954 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.632103 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.632132 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.632171 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.632197 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:20Z","lastTransitionTime":"2026-02-24T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.693256 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.693323 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:07:20 crc kubenswrapper[4824]: E0224 00:07:20.693457 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 00:07:20 crc kubenswrapper[4824]: E0224 00:07:20.693717 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.702851 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 03:42:42.009627066 +0000 UTC Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.735960 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.736025 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.736049 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.736082 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.736106 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:20Z","lastTransitionTime":"2026-02-24T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.838890 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.838940 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.838952 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.838972 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.838986 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:20Z","lastTransitionTime":"2026-02-24T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.840220 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.840273 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.840283 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.840300 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.840315 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:20Z","lastTransitionTime":"2026-02-24T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:20 crc kubenswrapper[4824]: E0224 00:07:20.862034 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7ea41d01-04ab-44da-af10-993e94777268\\\",\\\"systemUUID\\\":\\\"d5e3d68d-d538-4dbe-b3fe-7347ab36b29a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:20Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.867395 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.867433 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.867443 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.867460 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.867469 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:20Z","lastTransitionTime":"2026-02-24T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:20 crc kubenswrapper[4824]: E0224 00:07:20.885947 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7ea41d01-04ab-44da-af10-993e94777268\\\",\\\"systemUUID\\\":\\\"d5e3d68d-d538-4dbe-b3fe-7347ab36b29a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:20Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.891031 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.891085 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.891100 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.891122 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.891135 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:20Z","lastTransitionTime":"2026-02-24T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:20 crc kubenswrapper[4824]: E0224 00:07:20.909470 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7ea41d01-04ab-44da-af10-993e94777268\\\",\\\"systemUUID\\\":\\\"d5e3d68d-d538-4dbe-b3fe-7347ab36b29a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:20Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.913381 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.913449 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.913467 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.913494 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.913514 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:20Z","lastTransitionTime":"2026-02-24T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:20 crc kubenswrapper[4824]: E0224 00:07:20.934366 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [status patch payload identical to the previous attempt] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:20Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.939558 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.939602 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.939614 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.939633 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.939645 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:20Z","lastTransitionTime":"2026-02-24T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:20 crc kubenswrapper[4824]: E0224 00:07:20.955727 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [status patch payload identical to the previous attempt] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:20Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:20 crc kubenswrapper[4824]: E0224 00:07:20.955892 4824 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.958660 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.958715 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.958740 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.958761 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:20 crc kubenswrapper[4824]: I0224 00:07:20.958778 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:20Z","lastTransitionTime":"2026-02-24T00:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:21 crc kubenswrapper[4824]: I0224 00:07:21.061876 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:21 crc kubenswrapper[4824]: I0224 00:07:21.061927 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:21 crc kubenswrapper[4824]: I0224 00:07:21.061938 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:21 crc kubenswrapper[4824]: I0224 00:07:21.061959 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:21 crc kubenswrapper[4824]: I0224 00:07:21.061974 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:21Z","lastTransitionTime":"2026-02-24T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:21 crc kubenswrapper[4824]: I0224 00:07:21.164195 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:21 crc kubenswrapper[4824]: I0224 00:07:21.164253 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:21 crc kubenswrapper[4824]: I0224 00:07:21.164272 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:21 crc kubenswrapper[4824]: I0224 00:07:21.164298 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:21 crc kubenswrapper[4824]: I0224 00:07:21.164318 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:21Z","lastTransitionTime":"2026-02-24T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:21 crc kubenswrapper[4824]: I0224 00:07:21.267000 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:21 crc kubenswrapper[4824]: I0224 00:07:21.267090 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:21 crc kubenswrapper[4824]: I0224 00:07:21.267109 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:21 crc kubenswrapper[4824]: I0224 00:07:21.267134 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:21 crc kubenswrapper[4824]: I0224 00:07:21.267152 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:21Z","lastTransitionTime":"2026-02-24T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:21 crc kubenswrapper[4824]: I0224 00:07:21.370569 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:21 crc kubenswrapper[4824]: I0224 00:07:21.370623 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:21 crc kubenswrapper[4824]: I0224 00:07:21.370636 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:21 crc kubenswrapper[4824]: I0224 00:07:21.370656 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:21 crc kubenswrapper[4824]: I0224 00:07:21.370668 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:21Z","lastTransitionTime":"2026-02-24T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:21 crc kubenswrapper[4824]: I0224 00:07:21.473648 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:21 crc kubenswrapper[4824]: I0224 00:07:21.473692 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:21 crc kubenswrapper[4824]: I0224 00:07:21.473702 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:21 crc kubenswrapper[4824]: I0224 00:07:21.473725 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:21 crc kubenswrapper[4824]: I0224 00:07:21.473738 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:21Z","lastTransitionTime":"2026-02-24T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:21 crc kubenswrapper[4824]: I0224 00:07:21.575940 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:21 crc kubenswrapper[4824]: I0224 00:07:21.576249 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:21 crc kubenswrapper[4824]: I0224 00:07:21.576348 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:21 crc kubenswrapper[4824]: I0224 00:07:21.576446 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:21 crc kubenswrapper[4824]: I0224 00:07:21.576552 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:21Z","lastTransitionTime":"2026-02-24T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:21 crc kubenswrapper[4824]: I0224 00:07:21.680272 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:21 crc kubenswrapper[4824]: I0224 00:07:21.680324 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:21 crc kubenswrapper[4824]: I0224 00:07:21.680346 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:21 crc kubenswrapper[4824]: I0224 00:07:21.680373 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:21 crc kubenswrapper[4824]: I0224 00:07:21.680395 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:21Z","lastTransitionTime":"2026-02-24T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:21 crc kubenswrapper[4824]: I0224 00:07:21.693302 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:07:21 crc kubenswrapper[4824]: E0224 00:07:21.693509 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 00:07:21 crc kubenswrapper[4824]: I0224 00:07:21.703613 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 17:54:18.704585546 +0000 UTC Feb 24 00:07:21 crc kubenswrapper[4824]: I0224 00:07:21.783554 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:21 crc kubenswrapper[4824]: I0224 00:07:21.784039 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:21 crc kubenswrapper[4824]: I0224 00:07:21.784197 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:21 crc kubenswrapper[4824]: I0224 00:07:21.784348 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:21 crc kubenswrapper[4824]: I0224 00:07:21.784562 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:21Z","lastTransitionTime":"2026-02-24T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:21 crc kubenswrapper[4824]: I0224 00:07:21.888297 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:21 crc kubenswrapper[4824]: I0224 00:07:21.888368 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:21 crc kubenswrapper[4824]: I0224 00:07:21.888382 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:21 crc kubenswrapper[4824]: I0224 00:07:21.888402 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:21 crc kubenswrapper[4824]: I0224 00:07:21.888418 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:21Z","lastTransitionTime":"2026-02-24T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:21 crc kubenswrapper[4824]: I0224 00:07:21.991911 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:21 crc kubenswrapper[4824]: I0224 00:07:21.991968 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:21 crc kubenswrapper[4824]: I0224 00:07:21.991992 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:21 crc kubenswrapper[4824]: I0224 00:07:21.992022 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:21 crc kubenswrapper[4824]: I0224 00:07:21.992047 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:21Z","lastTransitionTime":"2026-02-24T00:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.095969 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.096053 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.096072 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.096098 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.096119 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:22Z","lastTransitionTime":"2026-02-24T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.199813 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.199890 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.199907 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.199935 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.199955 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:22Z","lastTransitionTime":"2026-02-24T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.303486 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.303581 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.303603 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.303629 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.303649 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:22Z","lastTransitionTime":"2026-02-24T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.406964 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.407019 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.407034 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.407057 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.407072 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:22Z","lastTransitionTime":"2026-02-24T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.510505 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.510604 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.510653 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.510678 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.510701 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:22Z","lastTransitionTime":"2026-02-24T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.576155 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-nwxht"] Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.576608 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-nwxht" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.578752 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.580795 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.581044 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.595394 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b147bde27310be40a9d49e2254ff858c03a32602bad1de273272b7b3292f057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":
{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:22Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.607825 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:22Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.613253 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.613313 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.613326 4824 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.613348 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.613363 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:22Z","lastTransitionTime":"2026-02-24T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.617793 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nwxht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea7ca84e-cb8c-4bdf-ba10-bd2aca09bdaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready 
status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nwxht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:22Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.636248 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:22Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.651373 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc76b60d9c2e639d3340a172e70670a434008774f3ec5751dae2ffc78a913486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:22Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.664331 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684e1bbbe61030fdd75db917dbf9ac4fedf392502a653c8f4600dc829410aa95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://983e13c7b9d6d0b8183250aedfe93ec3ac42de4dae305c03caa9f653082bb329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:22Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.668178 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxc9g\" (UniqueName: \"kubernetes.io/projected/ea7ca84e-cb8c-4bdf-ba10-bd2aca09bdaf-kube-api-access-lxc9g\") pod \"node-resolver-nwxht\" (UID: \"ea7ca84e-cb8c-4bdf-ba10-bd2aca09bdaf\") " pod="openshift-dns/node-resolver-nwxht" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.668242 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" 
(UniqueName: \"kubernetes.io/host-path/ea7ca84e-cb8c-4bdf-ba10-bd2aca09bdaf-hosts-file\") pod \"node-resolver-nwxht\" (UID: \"ea7ca84e-cb8c-4bdf-ba10-bd2aca09bdaf\") " pod="openshift-dns/node-resolver-nwxht" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.676295 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:22Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.690968 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5209eb40-b077-42f2-9239-27bf1cde0e05\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c8ceeca4d71401bcd6ae7d050ac58f840a0cc91f699f91f799930c896be922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd9b54ae9c9e895db2191be45a25d352d8c0617350519ad21f6ed3f2af941fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://bf3a76ba41dba79001ebad9083600800b0f094accb5dc60c2a364aa001909feb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T00:06:36Z\\\",\\\"message\\\":\\\"W0224 00:06:35.979734 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0224 00:06:35.980117 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771891595 cert, and key in /tmp/serving-cert-1325703996/serving-signer.crt, /tmp/serving-cert-1325703996/serving-signer.key\\\\nI0224 00:06:36.313594 1 observer_polling.go:159] Starting file observer\\\\nW0224 00:06:36.323413 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0224 00:06:36.325040 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 00:06:36.326853 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1325703996/tls.crt::/tmp/serving-cert-1325703996/tls.key\\\\\\\"\\\\nF0224 00:06:36.787238 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:06:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81bb21556a01a403315d2c9623aa07ab39ac4a789433423a89eda1da3547b6d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218
a0302a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:22Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.693143 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.693541 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:07:22 crc kubenswrapper[4824]: E0224 00:07:22.693725 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 00:07:22 crc kubenswrapper[4824]: E0224 00:07:22.693900 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.694166 4824 scope.go:117] "RemoveContainer" containerID="b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.703829 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 14:35:05.37190417 +0000 UTC Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.718663 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.718726 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.718738 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 
24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.718761 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.718776 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:22Z","lastTransitionTime":"2026-02-24T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.769057 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxc9g\" (UniqueName: \"kubernetes.io/projected/ea7ca84e-cb8c-4bdf-ba10-bd2aca09bdaf-kube-api-access-lxc9g\") pod \"node-resolver-nwxht\" (UID: \"ea7ca84e-cb8c-4bdf-ba10-bd2aca09bdaf\") " pod="openshift-dns/node-resolver-nwxht" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.769146 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ea7ca84e-cb8c-4bdf-ba10-bd2aca09bdaf-hosts-file\") pod \"node-resolver-nwxht\" (UID: \"ea7ca84e-cb8c-4bdf-ba10-bd2aca09bdaf\") " pod="openshift-dns/node-resolver-nwxht" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.769325 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ea7ca84e-cb8c-4bdf-ba10-bd2aca09bdaf-hosts-file\") pod \"node-resolver-nwxht\" (UID: \"ea7ca84e-cb8c-4bdf-ba10-bd2aca09bdaf\") " pod="openshift-dns/node-resolver-nwxht" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.797160 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxc9g\" (UniqueName: 
\"kubernetes.io/projected/ea7ca84e-cb8c-4bdf-ba10-bd2aca09bdaf-kube-api-access-lxc9g\") pod \"node-resolver-nwxht\" (UID: \"ea7ca84e-cb8c-4bdf-ba10-bd2aca09bdaf\") " pod="openshift-dns/node-resolver-nwxht" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.821508 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.821570 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.821584 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.821603 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.821614 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:22Z","lastTransitionTime":"2026-02-24T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.897945 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-nwxht" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.925896 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.925954 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.925973 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.926000 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.926020 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:22Z","lastTransitionTime":"2026-02-24T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.939691 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-vcbgn"] Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.940074 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-wvqfl"] Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.940348 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-wvqfl" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.940577 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.946936 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.946983 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.947168 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.947242 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.947248 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.947352 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.947418 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.947414 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-d64vq"] Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.947507 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.947584 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.947833 4824 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.948638 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-d64vq" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.951857 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.953844 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.961953 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b147bde27310be40a9d49e2254ff858c03a32602bad1de273272b7b3292f057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptable
s-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:22Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.979926 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nwxht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea7ca84e-cb8c-4bdf-ba10-bd2aca09bdaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nwxht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:22Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:22 crc kubenswrapper[4824]: I0224 00:07:22.994669 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"939ca085-9383-42e6-b7d6-37f101137273\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vcbgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:22Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.013399 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc76b60d9c2e639d3340a172e70670a434008774f3ec5751dae2ffc78a913486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.028123 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.031066 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.031104 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.031117 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.031137 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.031150 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:23Z","lastTransitionTime":"2026-02-24T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.041047 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.057723 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5209eb40-b077-42f2-9239-27bf1cde0e05\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c8ceeca4d71401bcd6ae7d050ac58f840a0cc91f699f91f799930c896be922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd9b54ae9c9e895db2191be45a25d352d8c0617350519ad21f6ed3f2af941fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf3a76ba41dba79001ebad9083600800b0f094accb5dc60c2a364aa001909feb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T00:06:36Z\\\",\\\"message\\\":\\\"W0224 00:06:35.979734 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0224 00:06:35.980117 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771891595 cert, and key in /tmp/serving-cert-1325703996/serving-signer.crt, 
/tmp/serving-cert-1325703996/serving-signer.key\\\\nI0224 00:06:36.313594 1 observer_polling.go:159] Starting file observer\\\\nW0224 00:06:36.323413 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0224 00:06:36.325040 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 00:06:36.326853 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1325703996/tls.crt::/tmp/serving-cert-1325703996/tls.key\\\\\\\"\\\\nF0224 00:06:36.787238 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:06:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81bb21556a01a403315d2c9623aa07ab39ac4a789433423a89eda1da3547b6d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.069783 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684e1bbbe61030fdd75db917dbf9ac4fedf392502a653c8f4600dc829410aa95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-
24T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://983e13c7b9d6d0b8183250aedfe93ec3ac42de4dae305c03caa9f653082bb329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.072165 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac-multus-socket-dir-parent\") pod \"multus-wvqfl\" (UID: \"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\") " pod="openshift-multus/multus-wvqfl" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.072221 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac-host-run-netns\") pod \"multus-wvqfl\" (UID: \"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\") " pod="openshift-multus/multus-wvqfl" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.072263 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac-os-release\") pod \"multus-wvqfl\" (UID: \"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\") " pod="openshift-multus/multus-wvqfl" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.072308 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac-multus-daemon-config\") pod \"multus-wvqfl\" (UID: \"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\") " pod="openshift-multus/multus-wvqfl" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.072348 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/28309e58-76b2-4fe6-a1e5-569b6f0b3a5e-system-cni-dir\") pod \"multus-additional-cni-plugins-d64vq\" (UID: \"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\") " pod="openshift-multus/multus-additional-cni-plugins-d64vq" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.072373 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-ttlvz\" (UniqueName: \"kubernetes.io/projected/28309e58-76b2-4fe6-a1e5-569b6f0b3a5e-kube-api-access-ttlvz\") pod \"multus-additional-cni-plugins-d64vq\" (UID: \"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\") " pod="openshift-multus/multus-additional-cni-plugins-d64vq" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.072410 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac-host-run-multus-certs\") pod \"multus-wvqfl\" (UID: \"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\") " pod="openshift-multus/multus-wvqfl" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.072441 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/939ca085-9383-42e6-b7d6-37f101137273-rootfs\") pod \"machine-config-daemon-vcbgn\" (UID: \"939ca085-9383-42e6-b7d6-37f101137273\") " pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.072466 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/28309e58-76b2-4fe6-a1e5-569b6f0b3a5e-os-release\") pod \"multus-additional-cni-plugins-d64vq\" (UID: \"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\") " pod="openshift-multus/multus-additional-cni-plugins-d64vq" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.072558 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac-etc-kubernetes\") pod \"multus-wvqfl\" (UID: \"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\") " pod="openshift-multus/multus-wvqfl" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.072582 4824 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/28309e58-76b2-4fe6-a1e5-569b6f0b3a5e-cnibin\") pod \"multus-additional-cni-plugins-d64vq\" (UID: \"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\") " pod="openshift-multus/multus-additional-cni-plugins-d64vq" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.072606 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac-host-var-lib-cni-multus\") pod \"multus-wvqfl\" (UID: \"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\") " pod="openshift-multus/multus-wvqfl" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.072637 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac-multus-conf-dir\") pod \"multus-wvqfl\" (UID: \"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\") " pod="openshift-multus/multus-wvqfl" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.072661 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/939ca085-9383-42e6-b7d6-37f101137273-mcd-auth-proxy-config\") pod \"machine-config-daemon-vcbgn\" (UID: \"939ca085-9383-42e6-b7d6-37f101137273\") " pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.072689 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac-multus-cni-dir\") pod \"multus-wvqfl\" (UID: \"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\") " pod="openshift-multus/multus-wvqfl" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.072712 4824 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac-system-cni-dir\") pod \"multus-wvqfl\" (UID: \"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\") " pod="openshift-multus/multus-wvqfl" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.072736 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/28309e58-76b2-4fe6-a1e5-569b6f0b3a5e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-d64vq\" (UID: \"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\") " pod="openshift-multus/multus-additional-cni-plugins-d64vq" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.072758 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjnn7\" (UniqueName: \"kubernetes.io/projected/15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac-kube-api-access-qjnn7\") pod \"multus-wvqfl\" (UID: \"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\") " pod="openshift-multus/multus-wvqfl" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.072783 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6f4s\" (UniqueName: \"kubernetes.io/projected/939ca085-9383-42e6-b7d6-37f101137273-kube-api-access-j6f4s\") pod \"machine-config-daemon-vcbgn\" (UID: \"939ca085-9383-42e6-b7d6-37f101137273\") " pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.072809 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/939ca085-9383-42e6-b7d6-37f101137273-proxy-tls\") pod \"machine-config-daemon-vcbgn\" (UID: \"939ca085-9383-42e6-b7d6-37f101137273\") " 
pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.072835 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac-hostroot\") pod \"multus-wvqfl\" (UID: \"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\") " pod="openshift-multus/multus-wvqfl" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.072859 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac-cni-binary-copy\") pod \"multus-wvqfl\" (UID: \"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\") " pod="openshift-multus/multus-wvqfl" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.072923 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac-host-var-lib-cni-bin\") pod \"multus-wvqfl\" (UID: \"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\") " pod="openshift-multus/multus-wvqfl" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.072976 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/28309e58-76b2-4fe6-a1e5-569b6f0b3a5e-cni-binary-copy\") pod \"multus-additional-cni-plugins-d64vq\" (UID: \"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\") " pod="openshift-multus/multus-additional-cni-plugins-d64vq" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.073016 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac-host-var-lib-kubelet\") pod \"multus-wvqfl\" (UID: 
\"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\") " pod="openshift-multus/multus-wvqfl" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.073046 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/28309e58-76b2-4fe6-a1e5-569b6f0b3a5e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-d64vq\" (UID: \"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\") " pod="openshift-multus/multus-additional-cni-plugins-d64vq" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.073085 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac-host-run-k8s-cni-cncf-io\") pod \"multus-wvqfl\" (UID: \"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\") " pod="openshift-multus/multus-wvqfl" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.073138 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac-cnibin\") pod \"multus-wvqfl\" (UID: \"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\") " pod="openshift-multus/multus-wvqfl" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.083704 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.096168 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvqfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvqfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.106741 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvqfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvqfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.121918 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5209eb40-b077-42f2-9239-27bf1cde0e05\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c8ceeca4d71401bcd6ae7d050ac58f840a0cc91f699f91f799930c896be922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd9b54ae9c9e895db2191be45a25d352d8c0617350519ad21f6ed3f2af941fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf3a76ba41dba79001ebad9083600800b0f094accb5dc60c2a364aa001909feb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T00:06:36Z\\\",\\\"message\\\":\\\"W0224 00:06:35.979734 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0224 00:06:35.980117 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771891595 cert, and key in /tmp/serving-cert-1325703996/serving-signer.crt, 
/tmp/serving-cert-1325703996/serving-signer.key\\\\nI0224 00:06:36.313594 1 observer_polling.go:159] Starting file observer\\\\nW0224 00:06:36.323413 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0224 00:06:36.325040 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 00:06:36.326853 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1325703996/tls.crt::/tmp/serving-cert-1325703996/tls.key\\\\\\\"\\\\nF0224 00:06:36.787238 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:06:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81bb21556a01a403315d2c9623aa07ab39ac4a789433423a89eda1da3547b6d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.134284 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.134336 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.134347 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.134366 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.134381 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:23Z","lastTransitionTime":"2026-02-24T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.138319 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684e1bbbe61030fdd75db917dbf9ac4fedf392502a653c8f4600dc829410aa95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://983e13c7b9d6d0b8183250aedfe93ec3ac42de4dae305c03caa9f653082bb329\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.151382 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-nwxht" event={"ID":"ea7ca84e-cb8c-4bdf-ba10-bd2aca09bdaf","Type":"ContainerStarted","Data":"f7a3b89d5ba26394edfe0be92dd9665a1f87335d098f3cc58a29d34b6745d414"} Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.153108 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.153417 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.154994 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"db87a346b00854fb6769899a263eb5f7fad8cfa115365e2197c0e02f7285ff99"} Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.155455 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.172108 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d64vq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d64vq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:23Z is after 2025-08-24T17:21:41Z"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.174324 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6f4s\" (UniqueName: \"kubernetes.io/projected/939ca085-9383-42e6-b7d6-37f101137273-kube-api-access-j6f4s\") pod \"machine-config-daemon-vcbgn\" (UID: \"939ca085-9383-42e6-b7d6-37f101137273\") " pod="openshift-machine-config-operator/machine-config-daemon-vcbgn"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.174362 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/939ca085-9383-42e6-b7d6-37f101137273-proxy-tls\") pod \"machine-config-daemon-vcbgn\" (UID: \"939ca085-9383-42e6-b7d6-37f101137273\") " pod="openshift-machine-config-operator/machine-config-daemon-vcbgn"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.174382 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac-hostroot\") pod \"multus-wvqfl\" (UID: \"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\") " pod="openshift-multus/multus-wvqfl"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.174400 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/28309e58-76b2-4fe6-a1e5-569b6f0b3a5e-cni-binary-copy\") pod \"multus-additional-cni-plugins-d64vq\" (UID: \"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\") " pod="openshift-multus/multus-additional-cni-plugins-d64vq"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.174425 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac-cni-binary-copy\") pod \"multus-wvqfl\" (UID: \"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\") " pod="openshift-multus/multus-wvqfl"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.174442 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac-host-var-lib-cni-bin\") pod \"multus-wvqfl\" (UID: \"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\") " pod="openshift-multus/multus-wvqfl"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.174460 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac-host-run-k8s-cni-cncf-io\") pod \"multus-wvqfl\" (UID: \"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\") " pod="openshift-multus/multus-wvqfl"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.174478 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac-host-var-lib-kubelet\") pod \"multus-wvqfl\" (UID: \"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\") " pod="openshift-multus/multus-wvqfl"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.174493 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/28309e58-76b2-4fe6-a1e5-569b6f0b3a5e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-d64vq\" (UID: \"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\") " pod="openshift-multus/multus-additional-cni-plugins-d64vq"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.174509 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac-cnibin\") pod \"multus-wvqfl\" (UID: \"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\") " pod="openshift-multus/multus-wvqfl"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.174543 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac-os-release\") pod \"multus-wvqfl\" (UID: \"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\") " pod="openshift-multus/multus-wvqfl"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.174561 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac-multus-socket-dir-parent\") pod \"multus-wvqfl\" (UID: \"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\") " pod="openshift-multus/multus-wvqfl"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.174580 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac-host-run-netns\") pod \"multus-wvqfl\" (UID: \"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\") " pod="openshift-multus/multus-wvqfl"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.174576 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac-hostroot\") pod \"multus-wvqfl\" (UID: \"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\") " pod="openshift-multus/multus-wvqfl"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.174595 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac-multus-daemon-config\") pod \"multus-wvqfl\" (UID: \"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\") " pod="openshift-multus/multus-wvqfl"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.174743 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/28309e58-76b2-4fe6-a1e5-569b6f0b3a5e-system-cni-dir\") pod \"multus-additional-cni-plugins-d64vq\" (UID: \"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\") " pod="openshift-multus/multus-additional-cni-plugins-d64vq"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.174777 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttlvz\" (UniqueName: \"kubernetes.io/projected/28309e58-76b2-4fe6-a1e5-569b6f0b3a5e-kube-api-access-ttlvz\") pod \"multus-additional-cni-plugins-d64vq\" (UID: \"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\") " pod="openshift-multus/multus-additional-cni-plugins-d64vq"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.174807 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac-host-run-multus-certs\") pod \"multus-wvqfl\" (UID: \"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\") " pod="openshift-multus/multus-wvqfl"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.174833 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/939ca085-9383-42e6-b7d6-37f101137273-rootfs\") pod \"machine-config-daemon-vcbgn\" (UID: \"939ca085-9383-42e6-b7d6-37f101137273\") " pod="openshift-machine-config-operator/machine-config-daemon-vcbgn"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.174875 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/28309e58-76b2-4fe6-a1e5-569b6f0b3a5e-os-release\") pod \"multus-additional-cni-plugins-d64vq\" (UID: \"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\") " pod="openshift-multus/multus-additional-cni-plugins-d64vq"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.174911 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac-etc-kubernetes\") pod \"multus-wvqfl\" (UID: \"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\") " pod="openshift-multus/multus-wvqfl"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.174936 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/28309e58-76b2-4fe6-a1e5-569b6f0b3a5e-cnibin\") pod \"multus-additional-cni-plugins-d64vq\" (UID: \"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\") " pod="openshift-multus/multus-additional-cni-plugins-d64vq"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.174968 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/939ca085-9383-42e6-b7d6-37f101137273-mcd-auth-proxy-config\") pod \"machine-config-daemon-vcbgn\" (UID: \"939ca085-9383-42e6-b7d6-37f101137273\") " pod="openshift-machine-config-operator/machine-config-daemon-vcbgn"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.174994 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac-multus-cni-dir\") pod \"multus-wvqfl\" (UID: \"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\") " pod="openshift-multus/multus-wvqfl"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.175018 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac-host-var-lib-cni-multus\") pod \"multus-wvqfl\" (UID: \"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\") " pod="openshift-multus/multus-wvqfl"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.175043 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac-multus-conf-dir\") pod \"multus-wvqfl\" (UID: \"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\") " pod="openshift-multus/multus-wvqfl"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.175060 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac-system-cni-dir\") pod \"multus-wvqfl\" (UID: \"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\") " pod="openshift-multus/multus-wvqfl"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.175078 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/28309e58-76b2-4fe6-a1e5-569b6f0b3a5e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-d64vq\" (UID: \"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\") " pod="openshift-multus/multus-additional-cni-plugins-d64vq"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.175100 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjnn7\" (UniqueName: \"kubernetes.io/projected/15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac-kube-api-access-qjnn7\") pod \"multus-wvqfl\" (UID: \"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\") " pod="openshift-multus/multus-wvqfl"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.175203 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac-host-var-lib-cni-bin\") pod \"multus-wvqfl\" (UID: \"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\") " pod="openshift-multus/multus-wvqfl"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.175234 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac-multus-daemon-config\") pod \"multus-wvqfl\" (UID: \"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\") " pod="openshift-multus/multus-wvqfl"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.175253 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac-host-run-k8s-cni-cncf-io\") pod \"multus-wvqfl\" (UID: \"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\") " pod="openshift-multus/multus-wvqfl"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.175282 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac-host-var-lib-kubelet\") pod \"multus-wvqfl\" (UID: \"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\") " pod="openshift-multus/multus-wvqfl"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.175290 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac-cnibin\") pod \"multus-wvqfl\" (UID: \"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\") " pod="openshift-multus/multus-wvqfl"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.175310 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac-etc-kubernetes\") pod \"multus-wvqfl\" (UID: \"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\") " pod="openshift-multus/multus-wvqfl"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.175358 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac-os-release\") pod \"multus-wvqfl\" (UID: \"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\") " pod="openshift-multus/multus-wvqfl"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.175396 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac-multus-socket-dir-parent\") pod \"multus-wvqfl\" (UID: \"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\") " pod="openshift-multus/multus-wvqfl"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.175401 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/28309e58-76b2-4fe6-a1e5-569b6f0b3a5e-cni-binary-copy\") pod \"multus-additional-cni-plugins-d64vq\" (UID: \"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\") " pod="openshift-multus/multus-additional-cni-plugins-d64vq"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.175439 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac-host-run-multus-certs\") pod \"multus-wvqfl\" (UID: \"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\") " pod="openshift-multus/multus-wvqfl"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.175420 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac-host-run-netns\") pod \"multus-wvqfl\" (UID: \"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\") " pod="openshift-multus/multus-wvqfl"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.175489 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/28309e58-76b2-4fe6-a1e5-569b6f0b3a5e-system-cni-dir\") pod \"multus-additional-cni-plugins-d64vq\" (UID: \"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\") " pod="openshift-multus/multus-additional-cni-plugins-d64vq"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.175504 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/28309e58-76b2-4fe6-a1e5-569b6f0b3a5e-cnibin\") pod \"multus-additional-cni-plugins-d64vq\" (UID: \"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\") " pod="openshift-multus/multus-additional-cni-plugins-d64vq"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.175534 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac-host-var-lib-cni-multus\") pod \"multus-wvqfl\" (UID: \"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\") " pod="openshift-multus/multus-wvqfl"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.175555 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac-multus-conf-dir\") pod \"multus-wvqfl\" (UID: \"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\") " pod="openshift-multus/multus-wvqfl"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.175663 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/28309e58-76b2-4fe6-a1e5-569b6f0b3a5e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-d64vq\" (UID: \"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\") " pod="openshift-multus/multus-additional-cni-plugins-d64vq"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.175764 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/939ca085-9383-42e6-b7d6-37f101137273-rootfs\") pod \"machine-config-daemon-vcbgn\" (UID: \"939ca085-9383-42e6-b7d6-37f101137273\") " pod="openshift-machine-config-operator/machine-config-daemon-vcbgn"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.175755 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac-system-cni-dir\") pod \"multus-wvqfl\" (UID: \"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\") " pod="openshift-multus/multus-wvqfl"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.175784 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac-cni-binary-copy\") pod \"multus-wvqfl\" (UID: \"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\") " pod="openshift-multus/multus-wvqfl"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.175788 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac-multus-cni-dir\") pod \"multus-wvqfl\" (UID: \"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\") " pod="openshift-multus/multus-wvqfl"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.175825 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/28309e58-76b2-4fe6-a1e5-569b6f0b3a5e-os-release\") pod \"multus-additional-cni-plugins-d64vq\" (UID: \"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\") " pod="openshift-multus/multus-additional-cni-plugins-d64vq"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.176693 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/28309e58-76b2-4fe6-a1e5-569b6f0b3a5e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-d64vq\" (UID: \"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\") " pod="openshift-multus/multus-additional-cni-plugins-d64vq"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.177184 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/939ca085-9383-42e6-b7d6-37f101137273-mcd-auth-proxy-config\") pod \"machine-config-daemon-vcbgn\" (UID: \"939ca085-9383-42e6-b7d6-37f101137273\") " pod="openshift-machine-config-operator/machine-config-daemon-vcbgn"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.189390 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b147bde27310be40a9d49e2254ff858c03a32602bad1de273272b7b3292f057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:23Z is after 2025-08-24T17:21:41Z"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.189704 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/939ca085-9383-42e6-b7d6-37f101137273-proxy-tls\") pod \"machine-config-daemon-vcbgn\" (UID: \"939ca085-9383-42e6-b7d6-37f101137273\") " pod="openshift-machine-config-operator/machine-config-daemon-vcbgn"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.193130 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6f4s\" (UniqueName: \"kubernetes.io/projected/939ca085-9383-42e6-b7d6-37f101137273-kube-api-access-j6f4s\") pod \"machine-config-daemon-vcbgn\" (UID: \"939ca085-9383-42e6-b7d6-37f101137273\") " pod="openshift-machine-config-operator/machine-config-daemon-vcbgn"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.197439 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjnn7\" (UniqueName: \"kubernetes.io/projected/15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac-kube-api-access-qjnn7\") pod \"multus-wvqfl\" (UID: \"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\") " pod="openshift-multus/multus-wvqfl"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.200489 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nwxht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea7ca84e-cb8c-4bdf-ba10-bd2aca09bdaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nwxht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:23Z is after 2025-08-24T17:21:41Z"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.215212 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttlvz\" (UniqueName: \"kubernetes.io/projected/28309e58-76b2-4fe6-a1e5-569b6f0b3a5e-kube-api-access-ttlvz\") pod \"multus-additional-cni-plugins-d64vq\" (UID: \"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\") " pod="openshift-multus/multus-additional-cni-plugins-d64vq"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.215758 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939ca085-9383-42e6-b7d6-37f101137273\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vcbgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:23Z is after 2025-08-24T17:21:41Z"
Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.230880 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc76b60d9c2e639d3340a172e70670a434008774f3ec5751dae2ffc78a913486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.242341 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.242372 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.242380 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.242397 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.242406 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:23Z","lastTransitionTime":"2026-02-24T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.246393 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.260714 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-wvqfl" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.269939 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.274993 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" Feb 24 00:07:23 crc kubenswrapper[4824]: W0224 00:07:23.275045 4824 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15b9ae43_8f87_4f2f_a8d9_b55c8fa986ac.slice/crio-eed966a1c7d3d7481120f71dd8191ff06e3b39fa4062245f1d7158b419a8aa0c WatchSource:0}: Error finding container eed966a1c7d3d7481120f71dd8191ff06e3b39fa4062245f1d7158b419a8aa0c: Status 404 returned error can't find the container with id eed966a1c7d3d7481120f71dd8191ff06e3b39fa4062245f1d7158b419a8aa0c Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.283993 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-d64vq" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.287366 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvqfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvqfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:23 crc kubenswrapper[4824]: W0224 00:07:23.289191 4824 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod939ca085_9383_42e6_b7d6_37f101137273.slice/crio-345e2c140a4948daeb252c8f8ef30d7622bfb85fa4f8a60e66b19550c294dfb8 WatchSource:0}: Error finding container 345e2c140a4948daeb252c8f8ef30d7622bfb85fa4f8a60e66b19550c294dfb8: Status 404 returned error can't find the container with id 345e2c140a4948daeb252c8f8ef30d7622bfb85fa4f8a60e66b19550c294dfb8 Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.302167 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5209eb40-b077-42f2-9239-27bf1cde0e05\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c8ceeca4d71401bcd6ae7d050ac58f840a0cc91f699f91f799930c896be922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd9b54ae9c9e895db2191be45a25d352d8c0617350519ad21f6ed3f2af941fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf3a76ba41dba79001ebad9083600800b0f094accb5dc60c2a364aa001909feb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db87a346b00854fb6769899a263eb5f7fad8cfa115365e2197c0e02f7285ff99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T00:06:36Z\\\",\\\"message\\\":\\\"W0224 00:06:35.979734 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0224 00:06:35.980117 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771891595 cert, and key in /tmp/serving-cert-1325703996/serving-signer.crt, /tmp/serving-cert-1325703996/serving-signer.key\\\\nI0224 00:06:36.313594 1 observer_polling.go:159] Starting file observer\\\\nW0224 00:06:36.323413 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0224 00:06:36.325040 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 00:06:36.326853 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1325703996/tls.crt::/tmp/serving-cert-1325703996/tls.key\\\\\\\"\\\\nF0224 00:06:36.787238 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:06:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81bb21556a01a403315d2c9623aa07ab39ac4a789433423a89eda1da3547b6d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:23 crc kubenswrapper[4824]: W0224 00:07:23.309179 4824 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28309e58_76b2_4fe6_a1e5_569b6f0b3a5e.slice/crio-9d09bc0d27b84e1e64488295d6c681e537e04c706cd059872bb297627c8e16fa WatchSource:0}: Error finding container 9d09bc0d27b84e1e64488295d6c681e537e04c706cd059872bb297627c8e16fa: Status 404 returned error can't find the container with id 9d09bc0d27b84e1e64488295d6c681e537e04c706cd059872bb297627c8e16fa Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.316143 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684e1bbbe61030fdd75db917dbf9ac4fedf392502a653c8f4600dc829410aa95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://983e13c7b9d6d0b8183250aedfe93ec3ac42de4dae305c03caa9f653082bb329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.333296 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.334361 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4xjg6"] Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.337686 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.347154 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.347383 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.347586 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.347717 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.347900 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.347966 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.348103 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.352815 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.352869 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.352883 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.352907 4824 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.352921 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:23Z","lastTransitionTime":"2026-02-24T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.353059 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d64vq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d64vq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.370954 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b147bde27310be40a9d49e2254ff858c03a32602bad1de273272b7b3292f057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-24T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.382421 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nwxht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea7ca84e-cb8c-4bdf-ba10-bd2aca09bdaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nwxht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.396120 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"939ca085-9383-42e6-b7d6-37f101137273\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vcbgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.410224 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc76b60d9c2e639d3340a172e70670a434008774f3ec5751dae2ffc78a913486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.423867 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.437288 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.450951 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nwxht" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea7ca84e-cb8c-4bdf-ba10-bd2aca09bdaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nwxht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.455139 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.455162 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.455171 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.455185 4824 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.455196 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:23Z","lastTransitionTime":"2026-02-24T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.463365 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939ca085-9383-42e6-b7d6-37f101137273\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vcbgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.481925 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-log-socket\") pod \"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.481978 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d985b875-dd5e-4767-a4e2-209894575a8f-ovn-node-metrics-cert\") pod \"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.482005 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-systemd-units\") pod \"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.482028 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-run-ovn\") pod \"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.482053 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6rnj\" (UniqueName: 
\"kubernetes.io/projected/d985b875-dd5e-4767-a4e2-209894575a8f-kube-api-access-x6rnj\") pod \"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.482074 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-host-run-netns\") pod \"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.482095 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-host-cni-bin\") pod \"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.482120 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-host-run-ovn-kubernetes\") pod \"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.482142 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d985b875-dd5e-4767-a4e2-209894575a8f-ovnkube-script-lib\") pod \"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.482164 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-var-lib-openvswitch\") pod \"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.482205 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-host-kubelet\") pod \"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.482226 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-host-slash\") pod \"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.482249 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-run-systemd\") pod \"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.482268 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-etc-openvswitch\") pod \"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.482288 4824 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-host-cni-netd\") pod \"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.482309 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.482334 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d985b875-dd5e-4767-a4e2-209894575a8f-ovnkube-config\") pod \"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.482354 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d985b875-dd5e-4767-a4e2-209894575a8f-env-overrides\") pod \"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.482384 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-node-log\") pod \"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: 
I0224 00:07:23.482403 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-run-openvswitch\") pod \"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.516715 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d64vq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d64vq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.544215 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b147bde27310be40a9d49e2254ff858c03a32602bad1de273272b7b3292f057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.558171 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.558483 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.558589 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.558689 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.558777 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:23Z","lastTransitionTime":"2026-02-24T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.568724 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc76b60d9c2e639d3340a172e70670a434008774f3ec5751dae2ffc78a913486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.582281 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.582853 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-var-lib-openvswitch\") pod \"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.582900 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-host-slash\") pod \"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.582947 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" 
(UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-run-systemd\") pod \"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.582964 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-var-lib-openvswitch\") pod \"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.583002 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-host-kubelet\") pod \"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.583020 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-run-systemd\") pod \"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.583026 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-etc-openvswitch\") pod \"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.583046 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-host-kubelet\") pod 
\"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.583049 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-host-cni-netd\") pod \"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.583070 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-etc-openvswitch\") pod \"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.583075 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.583096 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-host-cni-netd\") pod \"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.583097 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-host-var-lib-cni-networks-ovn-kubernetes\") pod 
\"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.583069 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-host-slash\") pod \"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.583098 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d985b875-dd5e-4767-a4e2-209894575a8f-ovnkube-config\") pod \"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.583357 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d985b875-dd5e-4767-a4e2-209894575a8f-env-overrides\") pod \"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.583409 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-node-log\") pod \"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.583478 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-run-openvswitch\") pod \"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.583557 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-node-log\") pod \"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.583561 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-log-socket\") pod \"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.583613 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-log-socket\") pod \"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.583625 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-run-openvswitch\") pod \"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.583613 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d985b875-dd5e-4767-a4e2-209894575a8f-ovn-node-metrics-cert\") pod \"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.583681 4824 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-systemd-units\") pod \"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.583704 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-run-ovn\") pod \"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.583733 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6rnj\" (UniqueName: \"kubernetes.io/projected/d985b875-dd5e-4767-a4e2-209894575a8f-kube-api-access-x6rnj\") pod \"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.583740 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-systemd-units\") pod \"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.583758 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-host-run-netns\") pod \"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.583787 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" 
(UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-host-run-netns\") pod \"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.583798 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-host-cni-bin\") pod \"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.583825 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-host-cni-bin\") pod \"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.583824 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d985b875-dd5e-4767-a4e2-209894575a8f-ovnkube-config\") pod \"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.583839 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-host-run-ovn-kubernetes\") pod \"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.583922 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/d985b875-dd5e-4767-a4e2-209894575a8f-ovnkube-script-lib\") pod \"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.583868 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-host-run-ovn-kubernetes\") pod \"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.584018 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-run-ovn\") pod \"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.585050 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d985b875-dd5e-4767-a4e2-209894575a8f-env-overrides\") pod \"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.589137 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d985b875-dd5e-4767-a4e2-209894575a8f-ovnkube-script-lib\") pod \"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.595291 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d985b875-dd5e-4767-a4e2-209894575a8f-ovn-node-metrics-cert\") pod 
\"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.602861 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.605830 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6rnj\" (UniqueName: \"kubernetes.io/projected/d985b875-dd5e-4767-a4e2-209894575a8f-kube-api-access-x6rnj\") pod \"ovnkube-node-4xjg6\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.616441 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684e1bbbe61030fdd75db917dbf9ac4fedf392502a653c8f4600dc829410aa95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://983e13c7b9d6d0b8183250aedfe93ec3ac42de4dae305c03caa9f653082bb329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.632208 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.650167 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvqfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvqfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.662017 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.662042 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.662050 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.662066 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.662077 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:23Z","lastTransitionTime":"2026-02-24T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.672787 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d985b875-dd5e-4767-a4e2-209894575a8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xjg6\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.676752 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.689015 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5209eb40-b077-42f2-9239-27bf1cde0e05\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c8ceeca4d71401bcd6ae7d050ac58f840a0cc91f699f91f799930c896be922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd9b54ae9c9e895db2191be45a25d352d8c0617350519ad21f6ed3f2af941fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://bf3a76ba41dba79001ebad9083600800b0f094accb5dc60c2a364aa001909feb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db87a346b00854fb6769899a263eb5f7fad8cfa115365e2197c0e02f7285ff99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T00:06:36Z\\\",\\\"message\\\":\\\"W0224 00:06:35.979734 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0224 00:06:35.980117 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771891595 cert, and key in /tmp/serving-cert-1325703996/serving-signer.crt, /tmp/serving-cert-1325703996/serving-signer.key\\\\nI0224 00:06:36.313594 1 observer_polling.go:159] Starting file observer\\\\nW0224 00:06:36.323413 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0224 00:06:36.325040 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 00:06:36.326853 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1325703996/tls.crt::/tmp/serving-cert-1325703996/tls.key\\\\\\\"\\\\nF0224 00:06:36.787238 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:06:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81bb21556a01a403315d2c9623aa07ab39ac4a789433423a89eda1da3547b6d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:23Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.693208 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:07:23 crc kubenswrapper[4824]: E0224 00:07:23.693359 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.704367 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 01:40:43.703549142 +0000 UTC Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.764566 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.764612 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.764621 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.764639 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.764651 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:23Z","lastTransitionTime":"2026-02-24T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.867963 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.868023 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.868035 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.868053 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.868067 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:23Z","lastTransitionTime":"2026-02-24T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.974987 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.975031 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.975042 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.975080 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:23 crc kubenswrapper[4824]: I0224 00:07:23.975097 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:23Z","lastTransitionTime":"2026-02-24T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.082598 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.082649 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.082660 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.082677 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.082691 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:24Z","lastTransitionTime":"2026-02-24T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.161082 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-nwxht" event={"ID":"ea7ca84e-cb8c-4bdf-ba10-bd2aca09bdaf","Type":"ContainerStarted","Data":"69860166bff170380f7baa9e9b405900881403e17fbb017db27e9136e1c3696e"} Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.162371 4824 generic.go:334] "Generic (PLEG): container finished" podID="d985b875-dd5e-4767-a4e2-209894575a8f" containerID="1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d" exitCode=0 Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.162431 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" event={"ID":"d985b875-dd5e-4767-a4e2-209894575a8f","Type":"ContainerDied","Data":"1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d"} Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.162458 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" event={"ID":"d985b875-dd5e-4767-a4e2-209894575a8f","Type":"ContainerStarted","Data":"39c21b24d26f0ce7cc1f64fcb5e9960f6a2487988e095495d5e73beb90c5e099"} Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.164122 4824 generic.go:334] "Generic (PLEG): container finished" podID="28309e58-76b2-4fe6-a1e5-569b6f0b3a5e" containerID="002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0" exitCode=0 Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.164157 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d64vq" event={"ID":"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e","Type":"ContainerDied","Data":"002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0"} Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.164215 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d64vq" 
event={"ID":"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e","Type":"ContainerStarted","Data":"9d09bc0d27b84e1e64488295d6c681e537e04c706cd059872bb297627c8e16fa"} Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.166387 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" event={"ID":"939ca085-9383-42e6-b7d6-37f101137273","Type":"ContainerStarted","Data":"c2cc64aac8c2354ba053a22c4c5cde7371a99c7aaaece1fe2dd2f7b41eceb4e5"} Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.166448 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" event={"ID":"939ca085-9383-42e6-b7d6-37f101137273","Type":"ContainerStarted","Data":"13c56d6a66d6912c9ae019eb515bfeb043d9ce26a200da5330581d07cd849ba3"} Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.166469 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" event={"ID":"939ca085-9383-42e6-b7d6-37f101137273","Type":"ContainerStarted","Data":"345e2c140a4948daeb252c8f8ef30d7622bfb85fa4f8a60e66b19550c294dfb8"} Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.168486 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wvqfl" event={"ID":"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac","Type":"ContainerStarted","Data":"4cafa37efd3e722940c0c2daae6107dc324136c75f889084d39711c20d7e887d"} Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.168559 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wvqfl" event={"ID":"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac","Type":"ContainerStarted","Data":"eed966a1c7d3d7481120f71dd8191ff06e3b39fa4062245f1d7158b419a8aa0c"} Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.181260 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5209eb40-b077-42f2-9239-27bf1cde0e05\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c8ceeca4d71401bcd6ae7d050ac58f840a0cc91f699f91f799930c896be922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd9b54ae9c9e895db2191be45a25d352d8c0617350519ad21f6ed3f2af941fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf3a76ba41dba79001ebad9083600800b0f094accb5dc60c2a364aa001909feb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db87a346b00854fb6769899a263eb5f7fad8cfa115365e2197c0e02f7285ff99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T00:06:36Z\\\",\\\"message\\\":\\\"W0224 00:06:35.979734 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0224 00:06:35.980117 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771891595 cert, and key in /tmp/serving-cert-1325703996/serving-signer.crt, /tmp/serving-cert-1325703996/serving-signer.key\\\\nI0224 00:06:36.313594 1 observer_polling.go:159] Starting file observer\\\\nW0224 00:06:36.323413 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0224 00:06:36.325040 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 00:06:36.326853 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1325703996/tls.crt::/tmp/serving-cert-1325703996/tls.key\\\\\\\"\\\\nF0224 00:06:36.787238 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:06:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81bb21556a01a403315d2c9623aa07ab39ac4a789433423a89eda1da3547b6d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:24Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.187902 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.187956 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.187966 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.187983 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.187993 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:24Z","lastTransitionTime":"2026-02-24T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.197700 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684e1bbbe61030fdd75db917dbf9ac4fedf392502a653c8f4600dc829410aa95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://983e13c7b9d6d0b8183250aedfe93ec3ac42de4dae305c03caa9f653082bb329\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:24Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.216084 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:24Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.230263 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvqfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvqfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:24Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.248604 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d985b875-dd5e-4767-a4e2-209894575a8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xjg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:24Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.262611 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b147bde27310be40a9d49e2254ff858c03a32602bad1de273272b7b3292f057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-24T00:07:24Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.275441 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nwxht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea7ca84e-cb8c-4bdf-ba10-bd2aca09bdaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69860166bff170380f7baa9e9b405900881403e17fbb017db27e9136e1c3696e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lxc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nwxht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:24Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.289319 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939ca085-9383-42e6-b7d6-37f101137273\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vcbgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:24Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.290898 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.290932 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.290945 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.290963 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.290976 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:24Z","lastTransitionTime":"2026-02-24T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.305745 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d64vq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d64vq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:24Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.331814 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc76b60d9c2e639d3340a172e70670a434008774f3ec5751dae2ffc78a913486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:24Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.353718 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:24Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.375378 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:24Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.392202 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc76b60d9c2e639d3340a172e70670a434008774f3ec5751dae2ffc78a913486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:24Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.394029 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.394087 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.394100 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.394119 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.394131 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:24Z","lastTransitionTime":"2026-02-24T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.417569 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:24Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.440531 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:24Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.461482 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5209eb40-b077-42f2-9239-27bf1cde0e05\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c8ceeca4d71401bcd6ae7d050ac58f840a0cc91f699f91f799930c896be922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd9b54ae9c9e895db2191be45a25d352d8c0617350519ad21f6ed3f2af941fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf3a76ba41dba79001ebad9083600800b0f094accb5dc60c2a364aa001909feb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db87a346b00854fb6769899a263eb5f7fad8cfa115365e2197c0e02f7285ff99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T00:06:36Z\\\",\\\"message\\\":\\\"W0224 00:06:35.979734 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0224 00:06:35.980117 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771891595 cert, and key in /tmp/serving-cert-1325703996/serving-signer.crt, /tmp/serving-cert-1325703996/serving-signer.key\\\\nI0224 00:06:36.313594 1 observer_polling.go:159] Starting file observer\\\\nW0224 00:06:36.323413 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0224 00:06:36.325040 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 00:06:36.326853 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1325703996/tls.crt::/tmp/serving-cert-1325703996/tls.key\\\\\\\"\\\\nF0224 00:06:36.787238 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:06:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81bb21556a01a403315d2c9623aa07ab39ac4a789433423a89eda1da3547b6d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:24Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.478224 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684e1bbbe61030fdd75db917dbf9ac4fedf392502a653c8f4600dc829410aa95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://983e13c7b9d6d0b8183250aedfe93ec3ac42de4dae305c03caa9f653082bb329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:24Z is after 2025-08-24T17:21:41Z" 
Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.491049 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:24Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.496130 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.496158 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.496167 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.496185 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.496199 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:24Z","lastTransitionTime":"2026-02-24T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.511081 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvqfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cafa37efd3e722940c0c2daae6107dc324136c75f889084d39711c20d7e887d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvqfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:24Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.532176 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d985b875-dd5e-4767-a4e2-209894575a8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xjg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:24Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.547193 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b147bde27310be40a9d49e2254ff858c03a32602bad1de273272b7b3292f057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:24Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.564732 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nwxht" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea7ca84e-cb8c-4bdf-ba10-bd2aca09bdaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69860166bff170380f7baa9e9b405900881403e17fbb017db27e9136e1c3696e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nwxht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:24Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.578881 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939ca085-9383-42e6-b7d6-37f101137273\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc64aac8c2354ba053a22c4c5cde7371a99c7aaaece1fe2dd2f7b41eceb4e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13c56d6a66d6912c9ae019eb515bfeb043d9ce26a200da5330581d07cd849ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vcbgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:24Z is after 2025-08-24T17:21:41Z"
Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.599019 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.599072 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.599085 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.599106 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.599122 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:24Z","lastTransitionTime":"2026-02-24T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.602031 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d64vq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d64vq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:24Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.693696 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.693696 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:07:24 crc kubenswrapper[4824]: E0224 00:07:24.693870 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 24 00:07:24 crc kubenswrapper[4824]: E0224 00:07:24.693923 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.701093 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.701140 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.701150 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.701168 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.701181 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:24Z","lastTransitionTime":"2026-02-24T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.705264 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 14:53:50.522666521 +0000 UTC
Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.804002 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.804041 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.804052 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.804072 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.804083 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:24Z","lastTransitionTime":"2026-02-24T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.912239 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.912285 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.912297 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.912316 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:24 crc kubenswrapper[4824]: I0224 00:07:24.912328 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:24Z","lastTransitionTime":"2026-02-24T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.015153 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.015286 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.015317 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.015351 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.015374 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:25Z","lastTransitionTime":"2026-02-24T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.119171 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.119265 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.119289 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.119325 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.119353 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:25Z","lastTransitionTime":"2026-02-24T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.176373 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" event={"ID":"d985b875-dd5e-4767-a4e2-209894575a8f","Type":"ContainerStarted","Data":"4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7"} Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.176443 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" event={"ID":"d985b875-dd5e-4767-a4e2-209894575a8f","Type":"ContainerStarted","Data":"f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04"} Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.176469 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" event={"ID":"d985b875-dd5e-4767-a4e2-209894575a8f","Type":"ContainerStarted","Data":"b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2"} Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.176490 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" event={"ID":"d985b875-dd5e-4767-a4e2-209894575a8f","Type":"ContainerStarted","Data":"869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8"} Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.176510 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" event={"ID":"d985b875-dd5e-4767-a4e2-209894575a8f","Type":"ContainerStarted","Data":"8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44"} Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.176555 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" event={"ID":"d985b875-dd5e-4767-a4e2-209894575a8f","Type":"ContainerStarted","Data":"0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91"} Feb 24 00:07:25 crc kubenswrapper[4824]: 
I0224 00:07:25.182257 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d64vq" event={"ID":"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e","Type":"ContainerStarted","Data":"3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80"} Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.203693 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684e1bbbe61030fdd75db917dbf9ac4fedf392502a653c8f4600dc829410aa95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secret
s/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://983e13c7b9d6d0b8183250aedfe93ec3ac42de4dae305c03caa9f653082bb329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.222773 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.222818 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.222834 4824 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.222851 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.222864 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:25Z","lastTransitionTime":"2026-02-24T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.235739 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.261655 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvqfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cafa37efd3e722940c0c2daae6107dc324136c75f889084d39711c20d7e887d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvqfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.290931 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d985b875-dd5e-4767-a4e2-209894575a8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xjg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.314183 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5209eb40-b077-42f2-9239-27bf1cde0e05\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c8ceeca4d71401bcd6ae7d050ac58f840a0cc91f699f91f799930c896be922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd9b54ae9c9e895db2191be45a25d352d8c0617350519ad21f6ed3f2af941fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://bf3a76ba41dba79001ebad9083600800b0f094accb5dc60c2a364aa001909feb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db87a346b00854fb6769899a263eb5f7fad8cfa115365e2197c0e02f7285ff99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T00:06:36Z\\\",\\\"message\\\":\\\"W0224 00:06:35.979734 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0224 00:06:35.980117 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771891595 cert, and key in /tmp/serving-cert-1325703996/serving-signer.crt, /tmp/serving-cert-1325703996/serving-signer.key\\\\nI0224 00:06:36.313594 1 observer_polling.go:159] Starting file observer\\\\nW0224 00:06:36.323413 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0224 00:06:36.325040 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 00:06:36.326853 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1325703996/tls.crt::/tmp/serving-cert-1325703996/tls.key\\\\\\\"\\\\nF0224 00:06:36.787238 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:06:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81bb21556a01a403315d2c9623aa07ab39ac4a789433423a89eda1da3547b6d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.329564 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nwxht" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea7ca84e-cb8c-4bdf-ba10-bd2aca09bdaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69860166bff170380f7baa9e9b405900881403e17fbb017db27e9136e1c3696e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nwxht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.338717 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.338789 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.338806 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.338824 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.338834 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:25Z","lastTransitionTime":"2026-02-24T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.350629 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939ca085-9383-42e6-b7d6-37f101137273\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc64aac8c2354ba053a22c4c5cde7371a99c7aaaece1fe2dd2f7b41eceb4e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13c56d6a66d6912c9ae019eb515bfeb043d9ce26a200da5330581d07cd849ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vcbgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.373374 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d64vq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d64vq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.392330 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b147bde27310be40a9d49e2254ff858c03a32602bad1de273272b7b3292f057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-24T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.411741 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc76b60d9c2e639d3340a172e70670a434008774f3ec5751dae2ffc78a913486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.425886 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.439847 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:25Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.442581 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.442619 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:25 crc 
kubenswrapper[4824]: I0224 00:07:25.442628 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.442644 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.442657 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:25Z","lastTransitionTime":"2026-02-24T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.545023 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.545068 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.545078 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.545095 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.545107 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:25Z","lastTransitionTime":"2026-02-24T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.657436 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.657545 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.657574 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.657606 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.657660 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:25Z","lastTransitionTime":"2026-02-24T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.692930 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:07:25 crc kubenswrapper[4824]: E0224 00:07:25.693132 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.706040 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 21:16:17.964738619 +0000 UTC Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.767767 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.767818 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.767834 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.767859 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.767877 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:25Z","lastTransitionTime":"2026-02-24T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.871471 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.871564 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.871584 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.871610 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.871627 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:25Z","lastTransitionTime":"2026-02-24T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.974681 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.974770 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.974789 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.974817 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:25 crc kubenswrapper[4824]: I0224 00:07:25.974839 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:25Z","lastTransitionTime":"2026-02-24T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.079569 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.079894 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.079905 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.079922 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.079933 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:26Z","lastTransitionTime":"2026-02-24T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.183684 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.183737 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.183747 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.183766 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.183779 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:26Z","lastTransitionTime":"2026-02-24T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.187251 4824 generic.go:334] "Generic (PLEG): container finished" podID="28309e58-76b2-4fe6-a1e5-569b6f0b3a5e" containerID="3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80" exitCode=0 Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.187315 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d64vq" event={"ID":"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e","Type":"ContainerDied","Data":"3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80"} Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.209004 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5209eb40-b077-42f2-9239-27bf1cde0e05\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c8ceeca4d71401bcd6ae7d050ac58f840a0cc91f699f91f799930c896be922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd9b54ae9c9e895db2191be45a25d352d8c0617350519ad21f6ed3f2af941fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://bf3a76ba41dba79001ebad9083600800b0f094accb5dc60c2a364aa001909feb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db87a346b00854fb6769899a263eb5f7fad8cfa115365e2197c0e02f7285ff99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T00:06:36Z\\\",\\\"message\\\":\\\"W0224 00:06:35.979734 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0224 00:06:35.980117 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771891595 cert, and key in /tmp/serving-cert-1325703996/serving-signer.crt, /tmp/serving-cert-1325703996/serving-signer.key\\\\nI0224 00:06:36.313594 1 observer_polling.go:159] Starting file observer\\\\nW0224 00:06:36.323413 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0224 00:06:36.325040 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 00:06:36.326853 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1325703996/tls.crt::/tmp/serving-cert-1325703996/tls.key\\\\\\\"\\\\nF0224 00:06:36.787238 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:06:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81bb21556a01a403315d2c9623aa07ab39ac4a789433423a89eda1da3547b6d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.227728 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684e1bbbe61030fdd75db917dbf9ac4fedf392502a653c8f4600dc829410aa95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://983e13c7b9d6d0b8183250aedfe93ec3ac42de4dae305c03caa9f653082bb329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.242087 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.263675 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvqfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cafa37efd3e722940c0c2daae6107dc324136c75f889084d39711c20d7e887d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvqfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.284497 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d985b875-dd5e-4767-a4e2-209894575a8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xjg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.287470 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.287504 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.287517 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.287649 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.287663 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:26Z","lastTransitionTime":"2026-02-24T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.298966 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b147bde27310be40a9d49e2254ff858c03a32602bad1de273272b7b3292f057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.312804 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nwxht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea7ca84e-cb8c-4bdf-ba10-bd2aca09bdaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69860166bff170380f7baa9e9b405900881403e17fbb017db27e9136e1c3696e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nwxht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.322382 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"939ca085-9383-42e6-b7d6-37f101137273\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc64aac8c2354ba053a22c4c5cde7371a99c7aaaece1fe2dd2f7b41eceb4e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13c56d6a66d6912c9ae019eb515bfeb043d9ce26
a200da5330581d07cd849ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vcbgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.334573 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d64vq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d64vq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.348049 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc76b60d9c2e639d3340a172e70670a434008774f3ec5751dae2ffc78a913486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.362062 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.381907 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.390622 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.390661 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.390671 4824 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.390689 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.390698 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:26Z","lastTransitionTime":"2026-02-24T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.493723 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.494139 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.494151 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.494169 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.494180 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:26Z","lastTransitionTime":"2026-02-24T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.597279 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.597316 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.597324 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.597338 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.597350 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:26Z","lastTransitionTime":"2026-02-24T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.693838 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.693838 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:07:26 crc kubenswrapper[4824]: E0224 00:07:26.694040 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 00:07:26 crc kubenswrapper[4824]: E0224 00:07:26.694181 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.701994 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.702050 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.702067 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.702090 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.702109 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:26Z","lastTransitionTime":"2026-02-24T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.706261 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 10:28:14.590885794 +0000 UTC Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.713036 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.733194 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.754901 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvqfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cafa37efd3e722940c0c2daae6107dc324136c75f889084d39711c20d7e887d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvqfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.785069 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d985b875-dd5e-4767-a4e2-209894575a8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xjg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.809360 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.809464 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.809483 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.809510 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.809562 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:26Z","lastTransitionTime":"2026-02-24T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.809928 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5209eb40-b077-42f2-9239-27bf1cde0e05\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c8ceeca4d71401bcd6ae7d050ac58f840a0cc91f699f91f799930c896be922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd9b54ae9c9e895db2191be45a25d352d8c0617350519ad21f6ed3f2af941fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://bf3a76ba41dba79001ebad9083600800b0f094accb5dc60c2a364aa001909feb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db87a346b00854fb6769899a263eb5f7fad8cfa115365e2197c0e02f7285ff99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T00:06:36Z\\\",\\\"message\\\":\\\"W0224 00:06:35.979734 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0224 00:06:35.980117 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771891595 cert, and key in /tmp/serving-cert-1325703996/serving-signer.crt, /tmp/serving-cert-1325703996/serving-signer.key\\\\nI0224 00:06:36.313594 1 observer_polling.go:159] Starting file observer\\\\nW0224 00:06:36.323413 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0224 00:06:36.325040 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 00:06:36.326853 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1325703996/tls.crt::/tmp/serving-cert-1325703996/tls.key\\\\\\\"\\\\nF0224 00:06:36.787238 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:06:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81bb21556a01a403315d2c9623aa07ab39ac4a789433423a89eda1da3547b6d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.825176 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684e1bbbe61030fdd75db917dbf9ac4fedf392502a653c8f4600dc829410aa95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://983e13c7b9d6d0b8183250aedfe93ec3ac42de4dae305c03caa9f653082bb329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.838645 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.855300 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d64vq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d64vq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.869647 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b147bde27310be40a9d49e2254ff858c03a32602bad1de273272b7b3292f057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-24T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.882810 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nwxht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea7ca84e-cb8c-4bdf-ba10-bd2aca09bdaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69860166bff170380f7baa9e9b405900881403e17fbb017db27e9136e1c3696e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lxc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nwxht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.897100 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939ca085-9383-42e6-b7d6-37f101137273\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc64aac8c2354ba053a22c4c5cde7
371a99c7aaaece1fe2dd2f7b41eceb4e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13c56d6a66d6912c9ae019eb515bfeb043d9ce26a200da5330581d07cd849ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-02-24T00:07:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vcbgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.910981 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc76b60d9c2e639d3340a172e70670a434008774f3ec5751dae2ffc78a913486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-e
tc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:26Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.912238 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.912271 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.912280 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.912299 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:26 crc kubenswrapper[4824]: I0224 00:07:26.912309 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:26Z","lastTransitionTime":"2026-02-24T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.015101 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.015131 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.015139 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.015156 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.015166 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:27Z","lastTransitionTime":"2026-02-24T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.117920 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.117967 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.117985 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.118011 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.118028 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:27Z","lastTransitionTime":"2026-02-24T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.197382 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" event={"ID":"d985b875-dd5e-4767-a4e2-209894575a8f","Type":"ContainerStarted","Data":"05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430"} Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.203017 4824 generic.go:334] "Generic (PLEG): container finished" podID="28309e58-76b2-4fe6-a1e5-569b6f0b3a5e" containerID="e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154" exitCode=0 Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.203094 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d64vq" event={"ID":"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e","Type":"ContainerDied","Data":"e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154"} Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.216772 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:27Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.221548 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.221596 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.221607 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 
00:07:27.221644 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.221657 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:27Z","lastTransitionTime":"2026-02-24T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.230081 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:27Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.244938 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5209eb40-b077-42f2-9239-27bf1cde0e05\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c8ceeca4d71401bcd6ae7d050ac58f840a0cc91f699f91f799930c896be922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd9b54ae9c9e895db2191be45a25d352d8c0617350519ad21f6ed3f2af941fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf3a76ba41dba79001ebad9083600800b0f094accb5dc60c2a364aa001909feb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db87a346b00854fb6769899a263eb5f7fad8cfa115365e2197c0e02f7285ff99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T00:06:36Z\\\",\\\"message\\\":\\\"W0224 00:06:35.979734 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0224 00:06:35.980117 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771891595 cert, and key in /tmp/serving-cert-1325703996/serving-signer.crt, /tmp/serving-cert-1325703996/serving-signer.key\\\\nI0224 00:06:36.313594 1 observer_polling.go:159] Starting file observer\\\\nW0224 00:06:36.323413 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0224 00:06:36.325040 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 00:06:36.326853 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1325703996/tls.crt::/tmp/serving-cert-1325703996/tls.key\\\\\\\"\\\\nF0224 00:06:36.787238 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:06:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81bb21556a01a403315d2c9623aa07ab39ac4a789433423a89eda1da3547b6d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:27Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.260328 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684e1bbbe61030fdd75db917dbf9ac4fedf392502a653c8f4600dc829410aa95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://983e13c7b9d6d0b8183250aedfe93ec3ac42de4dae305c03caa9f653082bb329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:27Z is after 2025-08-24T17:21:41Z" 
Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.274013 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:27Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.289180 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvqfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cafa37efd3e722940c0c2daae6107dc324136c75f889084d39711c20d7e887d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvqfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:27Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.310062 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d985b875-dd5e-4767-a4e2-209894575a8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xjg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:27Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.324624 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.324676 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.324688 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.324708 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.324720 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:27Z","lastTransitionTime":"2026-02-24T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.325555 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b147bde27310be40a9d49e2254ff858c03a32602bad1de273272b7b3292f057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:27Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.336322 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nwxht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea7ca84e-cb8c-4bdf-ba10-bd2aca09bdaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69860166bff170380f7baa9e9b405900881403e17fbb017db27e9136e1c3696e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nwxht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:27Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.351402 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"939ca085-9383-42e6-b7d6-37f101137273\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc64aac8c2354ba053a22c4c5cde7371a99c7aaaece1fe2dd2f7b41eceb4e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13c56d6a66d6912c9ae019eb515bfeb043d9ce26
a200da5330581d07cd849ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vcbgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:27Z is after 2025-08-24T17:21:41Z"
Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.370042 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d64vq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d64vq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:27Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 
00:07:27.385552 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc76b60d9c2e639d3340a172e70670a434008774f3ec5751dae2ffc78a913486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:27Z is after 2025-08-24T17:21:41Z"
Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.427573 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.427618 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.427631 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.427650 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.427665 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:27Z","lastTransitionTime":"2026-02-24T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.530461 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.530513 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.530542 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.530563 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.530576 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:27Z","lastTransitionTime":"2026-02-24T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.634629 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.635067 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.635093 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.635127 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.635151 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:27Z","lastTransitionTime":"2026-02-24T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.693353 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 24 00:07:27 crc kubenswrapper[4824]: E0224 00:07:27.693641 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.706686 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 03:04:56.561678482 +0000 UTC
Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.738831 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.738887 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.738908 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.738933 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.738952 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:27Z","lastTransitionTime":"2026-02-24T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.845566 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.845609 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.845619 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.845635 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.845644 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:27Z","lastTransitionTime":"2026-02-24T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.948084 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.948122 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.948130 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.948145 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 00:07:27 crc kubenswrapper[4824]: I0224 00:07:27.948154 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:27Z","lastTransitionTime":"2026-02-24T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.051172 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.051205 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.051214 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.051229 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.051238 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:28Z","lastTransitionTime":"2026-02-24T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.154101 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.154135 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.154142 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.154157 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.154166 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:28Z","lastTransitionTime":"2026-02-24T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.217902 4824 generic.go:334] "Generic (PLEG): container finished" podID="28309e58-76b2-4fe6-a1e5-569b6f0b3a5e" containerID="6cd5818c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50" exitCode=0 Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.217957 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d64vq" event={"ID":"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e","Type":"ContainerDied","Data":"6cd5818c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50"} Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.250782 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvqfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cafa37efd3e722940c0c2daae6107dc324136c75f889084d39711c20d7e887d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba
93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\"
:\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvqfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:28Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.256129 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.256162 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.256171 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.256186 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.256197 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:28Z","lastTransitionTime":"2026-02-24T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.288996 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d985b875-dd5e-4767-a4e2-209894575a8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xjg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:28Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.320068 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5209eb40-b077-42f2-9239-27bf1cde0e05\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c8ceeca4d71401bcd6ae7d050ac58f840a0cc91f699f91f799930c896be922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd9b54ae9c9e895db2191be45a25d352d8c0617350519ad21f6ed3f2af941fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://bf3a76ba41dba79001ebad9083600800b0f094accb5dc60c2a364aa001909feb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db87a346b00854fb6769899a263eb5f7fad8cfa115365e2197c0e02f7285ff99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T00:06:36Z\\\",\\\"message\\\":\\\"W0224 00:06:35.979734 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0224 00:06:35.980117 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771891595 cert, and key in /tmp/serving-cert-1325703996/serving-signer.crt, /tmp/serving-cert-1325703996/serving-signer.key\\\\nI0224 00:06:36.313594 1 observer_polling.go:159] Starting file observer\\\\nW0224 00:06:36.323413 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0224 00:06:36.325040 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 00:06:36.326853 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1325703996/tls.crt::/tmp/serving-cert-1325703996/tls.key\\\\\\\"\\\\nF0224 00:06:36.787238 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:06:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81bb21556a01a403315d2c9623aa07ab39ac4a789433423a89eda1da3547b6d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:28Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.335241 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684e1bbbe61030fdd75db917dbf9ac4fedf392502a653c8f4600dc829410aa95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://983e13c7b9d6d0b8183250aedfe93ec3ac42de4dae305c03caa9f653082bb329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:28Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.348632 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:28Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.363512 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.363563 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.363577 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.363596 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.363609 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:28Z","lastTransitionTime":"2026-02-24T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.366577 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d64vq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa64
6720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd5818c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cd5818c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedA
t\\\":\\\"2026-02-24T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d64vq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:28Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.380828 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b147bde27310be40a9d49e2254ff858c03a32602bad1de273272b7b3292f057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:28Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.392432 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nwxht" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea7ca84e-cb8c-4bdf-ba10-bd2aca09bdaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69860166bff170380f7baa9e9b405900881403e17fbb017db27e9136e1c3696e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nwxht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:28Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.407073 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939ca085-9383-42e6-b7d6-37f101137273\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc64aac8c2354ba053a22c4c5cde7371a99c7aaaece1fe2dd2f7b41eceb4e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13c56d6a66d6912c9ae019eb515bfeb043d9ce26a200da5330581d07cd849ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vcbgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:28Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.423376 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc76b60d9c2e639d3340a172e70670a434008774f3ec5751dae2ffc78a913486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",
\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:28Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.439940 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:28Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.459620 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:28Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.465505 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.465557 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.465567 4824 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.465583 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.465593 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:28Z","lastTransitionTime":"2026-02-24T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.569797 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.569874 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.569892 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.569938 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.569952 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:28Z","lastTransitionTime":"2026-02-24T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.673063 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.673130 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.673145 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.673170 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.673190 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:28Z","lastTransitionTime":"2026-02-24T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.693498 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.693498 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:07:28 crc kubenswrapper[4824]: E0224 00:07:28.693711 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 00:07:28 crc kubenswrapper[4824]: E0224 00:07:28.693771 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.706881 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 19:05:59.065908415 +0000 UTC Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.776094 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.776159 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.776170 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.776194 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.776211 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:28Z","lastTransitionTime":"2026-02-24T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.879808 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.879867 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.879881 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.879902 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.879915 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:28Z","lastTransitionTime":"2026-02-24T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.983361 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.983458 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.983482 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.983509 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:28 crc kubenswrapper[4824]: I0224 00:07:28.983570 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:28Z","lastTransitionTime":"2026-02-24T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.089892 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.090306 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.090318 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.090338 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.090351 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:29Z","lastTransitionTime":"2026-02-24T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.192663 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.192711 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.192726 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.192746 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.192763 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:29Z","lastTransitionTime":"2026-02-24T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.228173 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" event={"ID":"d985b875-dd5e-4767-a4e2-209894575a8f","Type":"ContainerStarted","Data":"9a2a925fe48fd97ac70a501bb5e22a03c571cd645c65d28ba6ce3318d93c26c5"} Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.229174 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.229298 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.229742 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.237203 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d64vq" event={"ID":"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e","Type":"ContainerStarted","Data":"43d152897b22f88ea3328011ed0b921e0f33f5c0375f4af98b424da48c20a268"} Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.251388 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc76b60d9c2e639d3340a172e70670a434008774f3ec5751dae2ffc78a913486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:29Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.261790 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.264495 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.273668 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:29Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.294865 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:29Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.295830 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.295873 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.295887 4824 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.295907 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.295920 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:29Z","lastTransitionTime":"2026-02-24T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.316822 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5209eb40-b077-42f2-9239-27bf1cde0e05\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c8ceeca4d71401bcd6ae7d050ac58f840a0cc91f699f91f799930c896be922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd9b54ae9c9e895db2191be45a25d352d8c0617350519ad21f6ed3f2af941fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://bf3a76ba41dba79001ebad9083600800b0f094accb5dc60c2a364aa001909feb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db87a346b00854fb6769899a263eb5f7fad8cfa115365e2197c0e02f7285ff99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T00:06:36Z\\\",\\\"message\\\":\\\"W0224 00:06:35.979734 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0224 00:06:35.980117 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771891595 cert, and key in /tmp/serving-cert-1325703996/serving-signer.crt, /tmp/serving-cert-1325703996/serving-signer.key\\\\nI0224 00:06:36.313594 1 observer_polling.go:159] Starting file observer\\\\nW0224 00:06:36.323413 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0224 00:06:36.325040 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 00:06:36.326853 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1325703996/tls.crt::/tmp/serving-cert-1325703996/tls.key\\\\\\\"\\\\nF0224 00:06:36.787238 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:06:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81bb21556a01a403315d2c9623aa07ab39ac4a789433423a89eda1da3547b6d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:29Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.335245 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684e1bbbe61030fdd75db917dbf9ac4fedf392502a653c8f4600dc829410aa95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://983e13c7b9d6d0b8183250aedfe93ec3ac42de4dae305c03caa9f653082bb329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:29Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.352541 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:29Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.374842 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvqfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cafa37efd3e722940c0c2daae6107dc324136c75f889084d39711c20d7e887d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvqfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:29Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.397952 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:29 crc 
kubenswrapper[4824]: I0224 00:07:29.398009 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.398026 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.398051 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.398068 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:29Z","lastTransitionTime":"2026-02-24T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.402776 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d985b875-dd5e-4767-a4e2-209894575a8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"stat
e\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a2a925fe48fd97ac70a501bb5e22a03c571cd645c65d28ba6ce3318d93c26c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"
},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a5
2430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xjg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:29Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.424620 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b147bde27310be40a9d49e2254ff858c03a32602bad1de273272b7b3292f057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-24T00:07:29Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.438375 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nwxht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea7ca84e-cb8c-4bdf-ba10-bd2aca09bdaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69860166bff170380f7baa9e9b405900881403e17fbb017db27e9136e1c3696e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lxc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nwxht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:29Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.456675 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939ca085-9383-42e6-b7d6-37f101137273\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc64aac8c2354ba053a22c4c5cde7
371a99c7aaaece1fe2dd2f7b41eceb4e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13c56d6a66d6912c9ae019eb515bfeb043d9ce26a200da5330581d07cd849ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-02-24T00:07:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vcbgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:29Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.473504 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d64vq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd5818c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cd5818c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-d64vq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:29Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.489445 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:29Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.502229 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.502299 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.502315 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.502341 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.502357 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:29Z","lastTransitionTime":"2026-02-24T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.505535 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:29Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.523172 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvqfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cafa37efd3e722940c0c2daae6107dc324136c75f889084d39711c20d7e887d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvqfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:29Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.545891 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d985b875-dd5e-4767-a4e2-209894575a8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a2a925fe48fd97ac70a501bb5e22a03c571cd645c65d28ba6ce3318d93c26c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xjg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:29Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.561163 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5209eb40-b077-42f2-9239-27bf1cde0e05\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c8ceeca4d71401bcd6ae7d050ac58f840a0cc91f699f91f799930c896be922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd9b54ae9c9e895db2191be45a25d352d8c0617350519ad21f6ed3f2af941fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://bf3a76ba41dba79001ebad9083600800b0f094accb5dc60c2a364aa001909feb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db87a346b00854fb6769899a263eb5f7fad8cfa115365e2197c0e02f7285ff99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T00:06:36Z\\\",\\\"message\\\":\\\"W0224 00:06:35.979734 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0224 00:06:35.980117 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771891595 cert, and key in /tmp/serving-cert-1325703996/serving-signer.crt, /tmp/serving-cert-1325703996/serving-signer.key\\\\nI0224 00:06:36.313594 1 observer_polling.go:159] Starting file observer\\\\nW0224 00:06:36.323413 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0224 00:06:36.325040 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 00:06:36.326853 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1325703996/tls.crt::/tmp/serving-cert-1325703996/tls.key\\\\\\\"\\\\nF0224 00:06:36.787238 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:06:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81bb21556a01a403315d2c9623aa07ab39ac4a789433423a89eda1da3547b6d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:29Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.579009 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684e1bbbe61030fdd75db917dbf9ac4fedf392502a653c8f4600dc829410aa95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://983e13c7b9d6d0b8183250aedfe93ec3ac42de4dae305c03caa9f653082bb329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:29Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.596404 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:29Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.604680 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.604740 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.604755 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.604781 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.604797 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:29Z","lastTransitionTime":"2026-02-24T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.615254 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d64vq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa64
6720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd5818c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cd5818c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedA
t\\\":\\\"2026-02-24T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d152897b22f88ea3328011ed0b921e0f33f5c0375f4af98b424da48c20a268\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"
}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d64vq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:29Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.630980 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b147bde27310be40a9d49e2254ff858c03a32602bad1de273272b7b3292f057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-24T00:07:29Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.647811 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nwxht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea7ca84e-cb8c-4bdf-ba10-bd2aca09bdaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69860166bff170380f7baa9e9b405900881403e17fbb017db27e9136e1c3696e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lxc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nwxht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:29Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.663256 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939ca085-9383-42e6-b7d6-37f101137273\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc64aac8c2354ba053a22c4c5cde7
371a99c7aaaece1fe2dd2f7b41eceb4e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13c56d6a66d6912c9ae019eb515bfeb043d9ce26a200da5330581d07cd849ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-02-24T00:07:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vcbgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:29Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.679188 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc76b60d9c2e639d3340a172e70670a434008774f3ec5751dae2ffc78a913486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-e
tc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:29Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.693474 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:07:29 crc kubenswrapper[4824]: E0224 00:07:29.693696 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.707241 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.707277 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.707289 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.707307 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.707319 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:29Z","lastTransitionTime":"2026-02-24T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.713917 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 00:34:01.214503728 +0000 UTC Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.782023 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-2zsq6"] Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.782538 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-2zsq6" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.787076 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.787473 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.788004 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.788592 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.803704 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc76b60d9c2e639d3340a172e70670a434008774f3ec5751dae2ffc78a913486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:29Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.810008 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.810039 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.810050 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.810068 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.810081 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:29Z","lastTransitionTime":"2026-02-24T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.820147 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:29Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.838133 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:29Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.869928 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d985b875-dd5e-4767-a4e2-209894575a8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a2a925fe48fd97ac70a501bb5e22a03c571cd645c65d28ba6ce3318d93c26c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xjg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:29Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.889935 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5209eb40-b077-42f2-9239-27bf1cde0e05\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c8ceeca4d71401bcd6ae7d050ac58f840a0cc91f699f91f799930c896be922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd9b54ae9c9e895db2191be45a25d352d8c0617350519ad21f6ed3f2af941fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://bf3a76ba41dba79001ebad9083600800b0f094accb5dc60c2a364aa001909feb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db87a346b00854fb6769899a263eb5f7fad8cfa115365e2197c0e02f7285ff99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T00:06:36Z\\\",\\\"message\\\":\\\"W0224 00:06:35.979734 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0224 00:06:35.980117 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771891595 cert, and key in /tmp/serving-cert-1325703996/serving-signer.crt, /tmp/serving-cert-1325703996/serving-signer.key\\\\nI0224 00:06:36.313594 1 observer_polling.go:159] Starting file observer\\\\nW0224 00:06:36.323413 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0224 00:06:36.325040 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 00:06:36.326853 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1325703996/tls.crt::/tmp/serving-cert-1325703996/tls.key\\\\\\\"\\\\nF0224 00:06:36.787238 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:06:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81bb21556a01a403315d2c9623aa07ab39ac4a789433423a89eda1da3547b6d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:29Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.917922 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.917965 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.917975 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.917996 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.918006 4824 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:29Z","lastTransitionTime":"2026-02-24T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.921488 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684e1bbbe61030fdd75db917dbf9ac4fedf392502a653c8f4600dc829410aa95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://983e13c7b9d6d0b8183250aedfe93ec3ac42de4dae305c03caa9f653082bb329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:29Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.940014 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:29Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.954802 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvqfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cafa37efd3e722940c0c2daae6107dc324136c75f889084d39711c20d7e887d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvqfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:29Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.957427 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/bffd69c2-56a8-4fa0-9fbf-82a508f80ec1-host\") pod \"node-ca-2zsq6\" (UID: \"bffd69c2-56a8-4fa0-9fbf-82a508f80ec1\") " pod="openshift-image-registry/node-ca-2zsq6" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.957502 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bffd69c2-56a8-4fa0-9fbf-82a508f80ec1-serviceca\") pod \"node-ca-2zsq6\" (UID: \"bffd69c2-56a8-4fa0-9fbf-82a508f80ec1\") " pod="openshift-image-registry/node-ca-2zsq6" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.957619 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkj25\" (UniqueName: \"kubernetes.io/projected/bffd69c2-56a8-4fa0-9fbf-82a508f80ec1-kube-api-access-dkj25\") pod \"node-ca-2zsq6\" (UID: \"bffd69c2-56a8-4fa0-9fbf-82a508f80ec1\") " pod="openshift-image-registry/node-ca-2zsq6" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.968343 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2zsq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bffd69c2-56a8-4fa0-9fbf-82a508f80ec1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkj25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2zsq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:29Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.983844 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b147bde27310be40a9d49e2254ff858c03a32602bad1de273272b7b3292f057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-24T00:07:29Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:29 crc kubenswrapper[4824]: I0224 00:07:29.997154 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nwxht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea7ca84e-cb8c-4bdf-ba10-bd2aca09bdaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69860166bff170380f7baa9e9b405900881403e17fbb017db27e9136e1c3696e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lxc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nwxht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:29Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.010958 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939ca085-9383-42e6-b7d6-37f101137273\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc64aac8c2354ba053a22c4c5cde7
371a99c7aaaece1fe2dd2f7b41eceb4e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13c56d6a66d6912c9ae019eb515bfeb043d9ce26a200da5330581d07cd849ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-02-24T00:07:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vcbgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:30Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.022457 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.022505 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.022530 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.022554 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.022564 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:30Z","lastTransitionTime":"2026-02-24T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.026471 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d64vq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd5818c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cd5818c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d152897b22f88ea3328011ed0b921e0f33f5c0375f4af98b424da48c20a268\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttl
vz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d64vq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:30Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.058636 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bffd69c2-56a8-4fa0-9fbf-82a508f80ec1-host\") pod \"node-ca-2zsq6\" (UID: \"bffd69c2-56a8-4fa0-9fbf-82a508f80ec1\") " pod="openshift-image-registry/node-ca-2zsq6" Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.058702 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bffd69c2-56a8-4fa0-9fbf-82a508f80ec1-serviceca\") pod \"node-ca-2zsq6\" (UID: \"bffd69c2-56a8-4fa0-9fbf-82a508f80ec1\") " pod="openshift-image-registry/node-ca-2zsq6" Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.058726 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkj25\" (UniqueName: \"kubernetes.io/projected/bffd69c2-56a8-4fa0-9fbf-82a508f80ec1-kube-api-access-dkj25\") pod \"node-ca-2zsq6\" (UID: \"bffd69c2-56a8-4fa0-9fbf-82a508f80ec1\") " pod="openshift-image-registry/node-ca-2zsq6" Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.058796 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bffd69c2-56a8-4fa0-9fbf-82a508f80ec1-host\") pod \"node-ca-2zsq6\" (UID: 
\"bffd69c2-56a8-4fa0-9fbf-82a508f80ec1\") " pod="openshift-image-registry/node-ca-2zsq6" Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.059874 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bffd69c2-56a8-4fa0-9fbf-82a508f80ec1-serviceca\") pod \"node-ca-2zsq6\" (UID: \"bffd69c2-56a8-4fa0-9fbf-82a508f80ec1\") " pod="openshift-image-registry/node-ca-2zsq6" Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.076883 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkj25\" (UniqueName: \"kubernetes.io/projected/bffd69c2-56a8-4fa0-9fbf-82a508f80ec1-kube-api-access-dkj25\") pod \"node-ca-2zsq6\" (UID: \"bffd69c2-56a8-4fa0-9fbf-82a508f80ec1\") " pod="openshift-image-registry/node-ca-2zsq6" Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.100604 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-2zsq6" Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.124210 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.124289 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.124302 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.124323 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.124335 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:30Z","lastTransitionTime":"2026-02-24T00:07:30Z","reason":"KubeletNotReady","message":"container runtime 
network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.228323 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.228387 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.228414 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.228437 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.228453 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:30Z","lastTransitionTime":"2026-02-24T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.243123 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2zsq6" event={"ID":"bffd69c2-56a8-4fa0-9fbf-82a508f80ec1","Type":"ContainerStarted","Data":"8ac7a616142a433b8c356f5835c547c6346851e18c553b7653b7003e821c9a50"} Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.332653 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.332906 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.332916 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.332933 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.332944 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:30Z","lastTransitionTime":"2026-02-24T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.435576 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.435622 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.435634 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.435653 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.435666 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:30Z","lastTransitionTime":"2026-02-24T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.539887 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.539921 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.539930 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.539946 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.539956 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:30Z","lastTransitionTime":"2026-02-24T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.644368 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.644421 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.644435 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.644453 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.644465 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:30Z","lastTransitionTime":"2026-02-24T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.693801 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.693899 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:07:30 crc kubenswrapper[4824]: E0224 00:07:30.694003 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 00:07:30 crc kubenswrapper[4824]: E0224 00:07:30.694622 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.714109 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 19:04:29.170569902 +0000 UTC Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.755023 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.755094 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.755116 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.755149 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.755173 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:30Z","lastTransitionTime":"2026-02-24T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.859377 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.859450 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.859469 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.859498 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.859650 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:30Z","lastTransitionTime":"2026-02-24T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.962132 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.962183 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.962194 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.962212 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:30 crc kubenswrapper[4824]: I0224 00:07:30.962225 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:30Z","lastTransitionTime":"2026-02-24T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.065840 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.065894 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.065917 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.065937 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.065952 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:31Z","lastTransitionTime":"2026-02-24T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.168938 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.168975 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.168984 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.168998 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.169008 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:31Z","lastTransitionTime":"2026-02-24T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.222011 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.222052 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.222059 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.222078 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.222088 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:31Z","lastTransitionTime":"2026-02-24T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:31 crc kubenswrapper[4824]: E0224 00:07:31.240312 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7ea41d01-04ab-44da-af10-993e94777268\\\",\\\"systemUUID\\\":\\\"d5e3d68d-d538-4dbe-b3fe-7347ab36b29a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.245709 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.245751 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.245764 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.245784 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.245801 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:31Z","lastTransitionTime":"2026-02-24T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.247671 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2zsq6" event={"ID":"bffd69c2-56a8-4fa0-9fbf-82a508f80ec1","Type":"ContainerStarted","Data":"4215a2974f4f4d1ab7b093c8d457f1f450276c89565a95ea13f1d42db9f07afd"} Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.257253 4824 generic.go:334] "Generic (PLEG): container finished" podID="28309e58-76b2-4fe6-a1e5-569b6f0b3a5e" containerID="43d152897b22f88ea3328011ed0b921e0f33f5c0375f4af98b424da48c20a268" exitCode=0 Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.257363 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d64vq" event={"ID":"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e","Type":"ContainerDied","Data":"43d152897b22f88ea3328011ed0b921e0f33f5c0375f4af98b424da48c20a268"} Feb 24 00:07:31 crc kubenswrapper[4824]: E0224 00:07:31.271486 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"message\\\":\\\"kubelet has 
no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c6
9fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\
\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737
e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909
bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7ea41d01-04ab-44da-af10-993e94777268\\\",\\\"systemUUID\\\":\\\"d5e3d68d-d538-4dbe-b3fe-7347ab36b29a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.278723 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5209eb40-b077-42f2-9239-27bf1cde0e05\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c8ceeca4d71401bcd6ae7d050ac58f840a0cc91f699f91f799930c896be922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd9b54ae9c9e895db2191be45a25d352d8c0617350519ad21f6ed3f2af941fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf3a76ba41dba79001ebad9083600800b0f094accb5dc60c2a364aa001909feb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db87a346b00854fb6769899a263eb5f7fad8cfa115365e2197c0e02f7285ff99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T00:06:36Z\\\",\\\"message\\\":\\\"W0224 00:06:35.979734 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0224 00:06:35.980117 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771891595 cert, and key in /tmp/serving-cert-1325703996/serving-signer.crt, 
/tmp/serving-cert-1325703996/serving-signer.key\\\\nI0224 00:06:36.313594 1 observer_polling.go:159] Starting file observer\\\\nW0224 00:06:36.323413 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0224 00:06:36.325040 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 00:06:36.326853 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1325703996/tls.crt::/tmp/serving-cert-1325703996/tls.key\\\\\\\"\\\\nF0224 00:06:36.787238 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:06:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81bb21556a01a403315d2c9623aa07ab39ac4a789433423a89eda1da3547b6d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\
\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.282360 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.282419 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.282439 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 
00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.282470 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.282490 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:31Z","lastTransitionTime":"2026-02-24T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.297453 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684e1bbbe61030fdd75db917dbf9ac4fedf392502a653c8f4600dc829410aa95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://983e13c7b9d6d0b8183250aedfe93ec3ac42de4dae305c03caa9f653082bb329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:31 crc kubenswrapper[4824]: E0224 
00:07:31.301377 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7ea41d01-04ab-44da-af10-993e94777268\\\",\\\"systemUUID\\\":\\\"d5e3d68d-d538-4dbe-b3fe-7347ab36b29a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.306312 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.306346 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.306356 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.306374 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.306385 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:31Z","lastTransitionTime":"2026-02-24T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.312269 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:31 crc kubenswrapper[4824]: E0224 00:07:31.323396 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7ea41d01-04ab-44da-af10-993e94777268\\\",\\\"systemUUID\\\":\\\"d5e3d68d-d538-4dbe-b3fe-7347ab36b29a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.328255 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.328301 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.328317 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.328341 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.328357 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:31Z","lastTransitionTime":"2026-02-24T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.334651 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvqfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cafa37efd3e722940c0c2daae6107dc324136c75f889084d39711c20d7e887d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvqfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:31Z 
is after 2025-08-24T17:21:41Z" Feb 24 00:07:31 crc kubenswrapper[4824]: E0224 00:07:31.346712 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7ea41d01-04ab-44da-af10-993e94777268\\\",\\\"systemUUID\\\":\\\"d5e3d68d-d538-4dbe-b3fe-7347ab36b29a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:31 crc kubenswrapper[4824]: E0224 00:07:31.346879 4824 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.348623 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.348653 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.348664 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.348682 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.348694 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:31Z","lastTransitionTime":"2026-02-24T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.355562 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d985b875-dd5e-4767-a4e2-209894575a8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a2a925fe48fd97ac70a501bb5e22a03c571cd645c65d28ba6ce3318d93c26c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xjg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.371668 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b147bde27310be40a9d49e2254ff858c03a32602bad1de273272b7b3292f057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:17Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.387653 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nwxht" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea7ca84e-cb8c-4bdf-ba10-bd2aca09bdaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69860166bff170380f7baa9e9b405900881403e17fbb017db27e9136e1c3696e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nwxht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.402176 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939ca085-9383-42e6-b7d6-37f101137273\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc64aac8c2354ba053a22c4c5cde7371a99c7aaaece1fe2dd2f7b41eceb4e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13c56d6a66d6912c9ae019eb515bfeb043d9ce26a200da5330581d07cd849ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vcbgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.418181 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d64vq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd5818c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cd5818c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d152897b22f88ea3328011ed0b921e0f33f5c0375f4af98b424da48c20a268\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttl
vz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d64vq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.430468 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2zsq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bffd69c2-56a8-4fa0-9fbf-82a508f80ec1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4215a2974f4f4d1ab7b093c8d457f1f450276c89565a95ea13f1d42db9f07afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520e
d63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkj25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2zsq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.444585 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc76b60d9c2e639d3340a172e70670a434008774f3ec5751dae2ffc78a913486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.451606 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.451647 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.451660 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.451682 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.451697 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:31Z","lastTransitionTime":"2026-02-24T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.457572 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.471701 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.487605 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.504831 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.527042 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5209eb40-b077-42f2-9239-27bf1cde0e05\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c8ceeca4d71401bcd6ae7d050ac58f840a0cc91f699f91f799930c896be922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd9b54ae9c9e895db2191be45a25d352d8c0617350519ad21f6ed3f2af941fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf3a76ba41dba79001ebad9083600800b0f094accb5dc60c2a364aa001909feb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db87a346b00854fb6769899a263eb5f7fad8cfa115365e2197c0e02f7285ff99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T00:06:36Z\\\",\\\"message\\\":\\\"W0224 00:06:35.979734 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0224 00:06:35.980117 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771891595 cert, and key in /tmp/serving-cert-1325703996/serving-signer.crt, /tmp/serving-cert-1325703996/serving-signer.key\\\\nI0224 00:06:36.313594 1 observer_polling.go:159] Starting file observer\\\\nW0224 00:06:36.323413 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0224 00:06:36.325040 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 00:06:36.326853 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1325703996/tls.crt::/tmp/serving-cert-1325703996/tls.key\\\\\\\"\\\\nF0224 00:06:36.787238 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:06:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81bb21556a01a403315d2c9623aa07ab39ac4a789433423a89eda1da3547b6d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.542197 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684e1bbbe61030fdd75db917dbf9ac4fedf392502a653c8f4600dc829410aa95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://983e13c7b9d6d0b8183250aedfe93ec3ac42de4dae305c03caa9f653082bb329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:31Z is after 2025-08-24T17:21:41Z" 
Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.555056 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.555102 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.555115 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.555139 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.555152 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:31Z","lastTransitionTime":"2026-02-24T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.558216 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.572076 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvqfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cafa37efd3e722940c0c2daae6107dc324136c75f889084d39711c20d7e887d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvqfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.590917 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d985b875-dd5e-4767-a4e2-209894575a8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a2a925fe48fd97ac70a501bb5e22a03c571cd645c65d28ba6ce3318d93c26c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xjg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.603179 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b147bde27310be40a9d49e2254ff858c03a32602bad1de273272b7b3292f057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:17Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.611658 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nwxht" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea7ca84e-cb8c-4bdf-ba10-bd2aca09bdaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69860166bff170380f7baa9e9b405900881403e17fbb017db27e9136e1c3696e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nwxht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.621881 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939ca085-9383-42e6-b7d6-37f101137273\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc64aac8c2354ba053a22c4c5cde7371a99c7aaaece1fe2dd2f7b41eceb4e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13c56d6a66d6912c9ae019eb515bfeb043d9ce26a200da5330581d07cd849ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vcbgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.635329 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d64vq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd5818c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cd5818c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d152897b22f88ea3328011ed0b921e0f33f5c0375f4af98b424da48c20a268\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43d152897b22f88ea3328011ed0b921e0f33f5c0375f4af98b424da48c20a268\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d64vq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.647707 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2zsq6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bffd69c2-56a8-4fa0-9fbf-82a508f80ec1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4215a2974f4f4d1ab7b093c8d457f1f450276c89565a95ea13f1d42db9f07afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkj25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2zsq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.659053 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.659102 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.659117 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.659408 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.659429 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:31Z","lastTransitionTime":"2026-02-24T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.673563 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc76b60d9c2e639d3340a172e70670a434008774f3ec5751dae2ffc78a913486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:31Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.693054 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:07:31 crc kubenswrapper[4824]: E0224 00:07:31.693201 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.714817 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 20:42:05.968491104 +0000 UTC Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.762731 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.762772 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.762784 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.762802 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.762816 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:31Z","lastTransitionTime":"2026-02-24T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.866424 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.866838 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.867076 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.867171 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.867255 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:31Z","lastTransitionTime":"2026-02-24T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.976994 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.977061 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.977079 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.977108 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:31 crc kubenswrapper[4824]: I0224 00:07:31.977126 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:31Z","lastTransitionTime":"2026-02-24T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.084810 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.084889 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.084907 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.084935 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.084954 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:32Z","lastTransitionTime":"2026-02-24T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.188542 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.188597 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.188609 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.188631 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.188644 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:32Z","lastTransitionTime":"2026-02-24T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.273586 4824 generic.go:334] "Generic (PLEG): container finished" podID="28309e58-76b2-4fe6-a1e5-569b6f0b3a5e" containerID="86a454258a9043824fa0ce318ad33be2fc4c029f75f0fc489b22716d69d67d45" exitCode=0 Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.273645 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d64vq" event={"ID":"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e","Type":"ContainerDied","Data":"86a454258a9043824fa0ce318ad33be2fc4c029f75f0fc489b22716d69d67d45"} Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.292259 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.292587 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.292682 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.292777 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.292866 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:32Z","lastTransitionTime":"2026-02-24T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.300017 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684e1bbbe61030fdd75db917dbf9ac4fedf392502a653c8f4600dc829410aa95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://983e13c7b9d6d0b8183250aedfe93ec3ac42de4dae305c03caa9f653082bb329\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:32Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.325272 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:32Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.350578 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvqfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cafa37efd3e722940c0c2daae6107dc324136c75f889084d39711c20d7e887d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvqfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:32Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.371759 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d985b875-dd5e-4767-a4e2-209894575a8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a2a925fe48fd97ac70a501bb5e22a03c571cd645c65d28ba6ce3318d93c26c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xjg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:32Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.389915 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5209eb40-b077-42f2-9239-27bf1cde0e05\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c8ceeca4d71401bcd6ae7d050ac58f840a0cc91f699f91f799930c896be922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd9b54ae9c9e895db2191be45a25d352d8c0617350519ad21f6ed3f2af941fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://bf3a76ba41dba79001ebad9083600800b0f094accb5dc60c2a364aa001909feb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db87a346b00854fb6769899a263eb5f7fad8cfa115365e2197c0e02f7285ff99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T00:06:36Z\\\",\\\"message\\\":\\\"W0224 00:06:35.979734 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0224 00:06:35.980117 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771891595 cert, and key in /tmp/serving-cert-1325703996/serving-signer.crt, /tmp/serving-cert-1325703996/serving-signer.key\\\\nI0224 00:06:36.313594 1 observer_polling.go:159] Starting file observer\\\\nW0224 00:06:36.323413 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0224 00:06:36.325040 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 00:06:36.326853 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1325703996/tls.crt::/tmp/serving-cert-1325703996/tls.key\\\\\\\"\\\\nF0224 00:06:36.787238 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:06:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81bb21556a01a403315d2c9623aa07ab39ac4a789433423a89eda1da3547b6d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:32Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.395736 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.395765 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.395773 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.395789 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.395801 4824 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:32Z","lastTransitionTime":"2026-02-24T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.403343 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nwxht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea7ca84e-cb8c-4bdf-ba10-bd2aca09bdaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69860166bff170380f7baa9e9b405900881403e17fbb017db27e9136e1c3696e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nwxht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:32Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.416609 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"939ca085-9383-42e6-b7d6-37f101137273\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc64aac8c2354ba053a22c4c5cde7371a99c7aaaece1fe2dd2f7b41eceb4e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13c56d6a66d6912c9ae019eb515bfeb043d9ce26
a200da5330581d07cd849ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vcbgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:32Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.433667 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d64vq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd5818c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cd5818c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d152897b22f88ea3328011ed0b921e0f33f5c0375f4af98b424da48c20a268\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43d152897b22f88ea3328011ed0b921e0f33f5c0375f4af98b424da48c20a268\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86a454258a9043824fa0ce318ad33be2fc4c029f75f0fc489b22716d69d67d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86a454258a9043824fa0ce318ad33be2fc4c029f75f0fc489b22716d69d67d45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d64vq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:32Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.444380 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2zsq6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bffd69c2-56a8-4fa0-9fbf-82a508f80ec1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4215a2974f4f4d1ab7b093c8d457f1f450276c89565a95ea13f1d42db9f07afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkj25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2zsq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:32Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.454922 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b147bde27310be40a9d49e2254ff858c03a32602bad1de273272b7b3292f057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T0
0:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:32Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.467948 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc76b60d9c2e639d3340a172e70670a434008774f3ec5751dae2ffc78a913486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:32Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.482277 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:32Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.496133 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:32Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.498497 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.499037 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:32 crc 
kubenswrapper[4824]: I0224 00:07:32.499055 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.499072 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.499084 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:32Z","lastTransitionTime":"2026-02-24T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.616230 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.616340 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.616376 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.616409 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.616442 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:07:32 crc kubenswrapper[4824]: E0224 00:07:32.616557 4824 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 00:07:32 crc kubenswrapper[4824]: E0224 00:07:32.616584 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:08:04.616510799 +0000 UTC m=+148.606135308 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:07:32 crc kubenswrapper[4824]: E0224 00:07:32.616644 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 00:08:04.616624322 +0000 UTC m=+148.606248821 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 00:07:32 crc kubenswrapper[4824]: E0224 00:07:32.616850 4824 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 24 00:07:32 crc kubenswrapper[4824]: E0224 00:07:32.616876 4824 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 24 00:07:32 crc kubenswrapper[4824]: E0224 00:07:32.616888 4824 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 
00:07:32 crc kubenswrapper[4824]: E0224 00:07:32.616939 4824 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 00:07:32 crc kubenswrapper[4824]: E0224 00:07:32.616942 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-24 00:08:04.616924499 +0000 UTC m=+148.606549158 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 00:07:32 crc kubenswrapper[4824]: E0224 00:07:32.617033 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 00:08:04.617017772 +0000 UTC m=+148.606642281 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 00:07:32 crc kubenswrapper[4824]: E0224 00:07:32.617026 4824 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 24 00:07:32 crc kubenswrapper[4824]: E0224 00:07:32.617070 4824 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 24 00:07:32 crc kubenswrapper[4824]: E0224 00:07:32.617085 4824 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 00:07:32 crc kubenswrapper[4824]: E0224 00:07:32.617152 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-24 00:08:04.617130545 +0000 UTC m=+148.606755014 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.618320 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.618352 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.618364 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.618384 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.618399 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:32Z","lastTransitionTime":"2026-02-24T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.694761 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:07:32 crc kubenswrapper[4824]: E0224 00:07:32.694962 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.695411 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:07:32 crc kubenswrapper[4824]: E0224 00:07:32.695556 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.716828 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 23:48:45.793589389 +0000 UTC Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.721771 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.721834 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.721853 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.721878 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.721896 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:32Z","lastTransitionTime":"2026-02-24T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.825110 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.825163 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.825180 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.825204 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.825218 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:32Z","lastTransitionTime":"2026-02-24T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.928464 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.928513 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.928564 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.928584 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 00:07:32 crc kubenswrapper[4824]: I0224 00:07:32.928595 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:32Z","lastTransitionTime":"2026-02-24T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.034028 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.034112 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.034142 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.034219 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.034248 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:33Z","lastTransitionTime":"2026-02-24T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.137612 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.137658 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.137670 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.137690 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.137703 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:33Z","lastTransitionTime":"2026-02-24T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.240386 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.240432 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.240443 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.240465 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.240477 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:33Z","lastTransitionTime":"2026-02-24T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.287342 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d64vq" event={"ID":"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e","Type":"ContainerStarted","Data":"1b16a40794ba6c7bfbe672f5438284c9479efc8d1fe6a87b8292b3595799772e"}
Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.291636 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xjg6_d985b875-dd5e-4767-a4e2-209894575a8f/ovnkube-controller/0.log"
Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.293510 4824 generic.go:334] "Generic (PLEG): container finished" podID="d985b875-dd5e-4767-a4e2-209894575a8f" containerID="9a2a925fe48fd97ac70a501bb5e22a03c571cd645c65d28ba6ce3318d93c26c5" exitCode=1
Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.293546 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" event={"ID":"d985b875-dd5e-4767-a4e2-209894575a8f","Type":"ContainerDied","Data":"9a2a925fe48fd97ac70a501bb5e22a03c571cd645c65d28ba6ce3318d93c26c5"}
Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.294260 4824 scope.go:117] "RemoveContainer" containerID="9a2a925fe48fd97ac70a501bb5e22a03c571cd645c65d28ba6ce3318d93c26c5"
Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.302738 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.321118 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.336772 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684e1bbbe61030fdd75db917dbf9ac4fedf392502a653c8f4600dc829410aa95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://983e13c7b9d6d0b8183250aedfe93ec3ac42de4dae305c03caa9f653082bb329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.342833 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.343003 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.343251 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.343390 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.343504 4824 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:33Z","lastTransitionTime":"2026-02-24T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.353320 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.373848 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvqfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cafa37efd3e722940c0c2daae6107dc324136c75f889084d39711c20d7e887d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvqfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.400731 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d985b875-dd5e-4767-a4e2-209894575a8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a2a925fe48fd97ac70a501bb5e22a03c571cd645c65d28ba6ce3318d93c26c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xjg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.416562 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5209eb40-b077-42f2-9239-27bf1cde0e05\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c8ceeca4d71401bcd6ae7d050ac58f840a0cc91f699f91f799930c896be922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd9b54ae9c9e895db2191be45a25d352d8c0617350519ad21f6ed3f2af941fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://bf3a76ba41dba79001ebad9083600800b0f094accb5dc60c2a364aa001909feb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db87a346b00854fb6769899a263eb5f7fad8cfa115365e2197c0e02f7285ff99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T00:06:36Z\\\",\\\"message\\\":\\\"W0224 00:06:35.979734 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0224 00:06:35.980117 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771891595 cert, and key in /tmp/serving-cert-1325703996/serving-signer.crt, /tmp/serving-cert-1325703996/serving-signer.key\\\\nI0224 00:06:36.313594 1 observer_polling.go:159] Starting file observer\\\\nW0224 00:06:36.323413 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0224 00:06:36.325040 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 00:06:36.326853 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1325703996/tls.crt::/tmp/serving-cert-1325703996/tls.key\\\\\\\"\\\\nF0224 00:06:36.787238 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:06:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81bb21556a01a403315d2c9623aa07ab39ac4a789433423a89eda1da3547b6d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.428860 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nwxht" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea7ca84e-cb8c-4bdf-ba10-bd2aca09bdaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69860166bff170380f7baa9e9b405900881403e17fbb017db27e9136e1c3696e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nwxht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.443103 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939ca085-9383-42e6-b7d6-37f101137273\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc64aac8c2354ba053a22c4c5cde7371a99c7aaaece1fe2dd2f7b41eceb4e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13c56d6a66d6912c9ae019eb515bfeb043d9ce26a200da5330581d07cd849ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vcbgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.447785 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.447824 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.447836 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.447857 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.447873 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:33Z","lastTransitionTime":"2026-02-24T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.460662 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d64vq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b16a40794ba6c7bfbe672f5438284c9479efc8d1fe6a87b8292b3595799772e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd5818c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cd5818c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d152897b22f88ea3328011ed0b921e0f33f5c0375f4af98b424da48c20a268\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43d152897b22f88ea3328011ed0b921e0f33f5c0375f4af98b424da48c20a268\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86a454258a9043824fa0ce318ad33be2fc4c029f75f0fc489b22716d69d67d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86a454258a9043824fa0ce318ad33be2fc4c029f75f0fc489b22716d69d67d45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d64vq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.478745 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2zsq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bffd69c2-56a8-4fa0-9fbf-82a508f80ec1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4215a2974f4f4d1ab7b093c8d457f1f450276c89565a95ea13f1d42db9f07afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkj25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2zsq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.495950 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b147bde27310be40a9d49e2254ff858c03a32602bad1de273272b7b3292f057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-24T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.517315 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc76b60d9c2e639d3340a172e70670a434008774f3ec5751dae2ffc78a913486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.546472 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5209eb40-b077-42f2-9239-27bf1cde0e05\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c8ceeca4d71401bcd6ae7d050ac58f840a0cc91f699f91f799930c896be922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd9b54ae9c9e895db2191be45a25d352d8c0617350519ad21f6ed3f2af941fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://bf3a76ba41dba79001ebad9083600800b0f094accb5dc60c2a364aa001909feb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db87a346b00854fb6769899a263eb5f7fad8cfa115365e2197c0e02f7285ff99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T00:06:36Z\\\",\\\"message\\\":\\\"W0224 00:06:35.979734 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0224 00:06:35.980117 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771891595 cert, and key in /tmp/serving-cert-1325703996/serving-signer.crt, /tmp/serving-cert-1325703996/serving-signer.key\\\\nI0224 00:06:36.313594 1 observer_polling.go:159] Starting file observer\\\\nW0224 00:06:36.323413 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0224 00:06:36.325040 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 00:06:36.326853 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1325703996/tls.crt::/tmp/serving-cert-1325703996/tls.key\\\\\\\"\\\\nF0224 00:06:36.787238 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:06:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81bb21556a01a403315d2c9623aa07ab39ac4a789433423a89eda1da3547b6d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.557134 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.557180 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.557192 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.557213 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.557225 4824 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:33Z","lastTransitionTime":"2026-02-24T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.565020 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684e1bbbe61030fdd75db917dbf9ac4fedf392502a653c8f4600dc829410aa95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://983e13c7b9d6d0b8183250aedfe93ec3ac42de4dae305c03caa9f653082bb329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.578720 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.592126 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvqfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cafa37efd3e722940c0c2daae6107dc324136c75f889084d39711c20d7e887d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvqfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.612647 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d985b875-dd5e-4767-a4e2-209894575a8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a2a925fe48fd97ac70a501bb5e22a03c571cd645c65d28ba6ce3318d93c26c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a2a925fe48fd97ac70a501bb5e22a03c571cd645c65d28ba6ce3318d93c26c5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T00:07:33Z\\\",\\\"message\\\":\\\"7:33.236354 6496 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0224 00:07:33.236725 6496 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0224 00:07:33.236821 6496 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0224 00:07:33.237004 6496 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0224 00:07:33.237441 6496 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0224 00:07:33.237802 6496 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0224 00:07:33.237929 6496 reflector.go:311] Stopping reflector *v1.Pod (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be3c5937d7b3ea2f2395acc
fb4af6e505168220b33cda00d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xjg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.625268 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b147bde27310be40a9d49e2254ff858c03a32602bad1de273272b7b3292f057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-24T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.637377 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nwxht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea7ca84e-cb8c-4bdf-ba10-bd2aca09bdaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69860166bff170380f7baa9e9b405900881403e17fbb017db27e9136e1c3696e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lxc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nwxht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.649476 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939ca085-9383-42e6-b7d6-37f101137273\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc64aac8c2354ba053a22c4c5cde7
371a99c7aaaece1fe2dd2f7b41eceb4e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13c56d6a66d6912c9ae019eb515bfeb043d9ce26a200da5330581d07cd849ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-02-24T00:07:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vcbgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.660397 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.660447 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.660460 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.660484 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.660496 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:33Z","lastTransitionTime":"2026-02-24T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.668679 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d64vq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b16a40794ba6c7bfbe672f5438284c9479efc8d1fe6a87b8292b3595799772e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd5818c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cd5818c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d152897b22f88ea3328011ed0b921e0f33f5c0375f4af98b424da48c20a268\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43d152897b22f88ea3328011ed0b921e0f33f5c0375f4af98b424da48c20a268\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86a454258a9043824fa0ce318ad33be2fc4c029f75f0fc489b22716d69d67d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86a454258a9043824fa0ce318ad33be2fc4c029f75f0fc489b22716d69d67d45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d64vq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.681106 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2zsq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bffd69c2-56a8-4fa0-9fbf-82a508f80ec1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4215a2974f4f4d1ab7b093c8d457f1f450276c89565a95ea13f1d42db9f07afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkj25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2zsq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.692768 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:07:33 crc kubenswrapper[4824]: E0224 00:07:33.692945 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.704931 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc76b60d9c2e639d3340a172e70670a434008774f3ec5751dae2ffc78a913486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\"
:true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.717409 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 19:32:09.649442555 +0000 UTC Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.746888 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.763809 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.763855 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.763863 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 
00:07:33.763880 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.763892 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:33Z","lastTransitionTime":"2026-02-24T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.774343 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:33Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.865869 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.865909 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.865921 4824 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.865942 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.865953 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:33Z","lastTransitionTime":"2026-02-24T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.969338 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.969391 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.969403 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.969421 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:33 crc kubenswrapper[4824]: I0224 00:07:33.969434 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:33Z","lastTransitionTime":"2026-02-24T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.073586 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.073651 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.073673 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.073702 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.073726 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:34Z","lastTransitionTime":"2026-02-24T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.177694 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.177760 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.177789 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.177817 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.177836 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:34Z","lastTransitionTime":"2026-02-24T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.280430 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.280484 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.280498 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.280534 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.280551 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:34Z","lastTransitionTime":"2026-02-24T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.306751 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xjg6_d985b875-dd5e-4767-a4e2-209894575a8f/ovnkube-controller/0.log" Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.310064 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" event={"ID":"d985b875-dd5e-4767-a4e2-209894575a8f","Type":"ContainerStarted","Data":"9d676f945e46c0757bafb1cce26c0a049ef7be697902fb79cb2fc18454a13952"} Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.310968 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.329355 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5209eb40-b077-42f2-9239-27bf1cde0e05\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c8ceeca4d71401bcd6ae7d050ac58f840a0cc91f699f91f799930c896be922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd9b54ae9c9e895db2191be45a25d352d8c0617350519ad21f6ed3f2af941fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://bf3a76ba41dba79001ebad9083600800b0f094accb5dc60c2a364aa001909feb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db87a346b00854fb6769899a263eb5f7fad8cfa115365e2197c0e02f7285ff99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T00:06:36Z\\\",\\\"message\\\":\\\"W0224 00:06:35.979734 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0224 00:06:35.980117 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771891595 cert, and key in /tmp/serving-cert-1325703996/serving-signer.crt, /tmp/serving-cert-1325703996/serving-signer.key\\\\nI0224 00:06:36.313594 1 observer_polling.go:159] Starting file observer\\\\nW0224 00:06:36.323413 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0224 00:06:36.325040 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 00:06:36.326853 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1325703996/tls.crt::/tmp/serving-cert-1325703996/tls.key\\\\\\\"\\\\nF0224 00:06:36.787238 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:06:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81bb21556a01a403315d2c9623aa07ab39ac4a789433423a89eda1da3547b6d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:34Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.346320 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684e1bbbe61030fdd75db917dbf9ac4fedf392502a653c8f4600dc829410aa95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://983e13c7b9d6d0b8183250aedfe93ec3ac42de4dae305c03caa9f653082bb329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:34Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.367038 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:34Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.383429 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.383469 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.383481 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.383499 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.383512 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:34Z","lastTransitionTime":"2026-02-24T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.383617 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvqfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cafa37efd3e722940c0c2daae6107dc324136c75f889084d39711c20d7e887d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvqfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:34Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.406365 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d985b875-dd5e-4767-a4e2-209894575a8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d676f945e46c0757bafb1cce26c0a049ef7be697902fb79cb2fc18454a13952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a2a925fe48fd97ac70a501bb5e22a03c571cd645c65d28ba6ce3318d93c26c5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T00:07:33Z\\\",\\\"message\\\":\\\"7:33.236354 6496 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0224 00:07:33.236725 6496 reflector.go:311] Stopping reflector *v1.EgressService (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0224 00:07:33.236821 6496 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0224 00:07:33.237004 6496 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0224 00:07:33.237441 6496 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0224 00:07:33.237802 6496 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0224 00:07:33.237929 6496 reflector.go:311] Stopping reflector *v1.Pod (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnku
be-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
ecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xjg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:34Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.420426 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b147bde27310be40a9d49e2254ff858c03a32602bad1de273272b7b3292f057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-24T00:07:34Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.431658 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nwxht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea7ca84e-cb8c-4bdf-ba10-bd2aca09bdaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69860166bff170380f7baa9e9b405900881403e17fbb017db27e9136e1c3696e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lxc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nwxht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:34Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.444257 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939ca085-9383-42e6-b7d6-37f101137273\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc64aac8c2354ba053a22c4c5cde7
371a99c7aaaece1fe2dd2f7b41eceb4e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13c56d6a66d6912c9ae019eb515bfeb043d9ce26a200da5330581d07cd849ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-02-24T00:07:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vcbgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:34Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.460903 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d64vq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b16a40794ba6c7bfbe672f5438284c9479efc8d1fe6a87b8292b3595799772e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:0
7:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd5818c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cd5818c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d152897b22f88ea3328011ed0b921e0f33f5c0375f4af98b424da48c20a268\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43d152897b22f88ea3328011ed0b921e0f33f5c0375f4af98b424da48c20a268\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86a454258a9043824fa0ce318ad33be2fc4c029f75f0fc489b22716d69d67d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86a454258a9043824fa0ce318ad33be2fc4c029f75f0fc489b22716d69d67d45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:31Z\\\"}},\\\"volume
Mounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d64vq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:34Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.473747 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2zsq6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bffd69c2-56a8-4fa0-9fbf-82a508f80ec1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4215a2974f4f4d1ab7b093c8d457f1f450276c89565a95ea13f1d42db9f07afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkj25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2zsq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:34Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.486948 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.487017 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.487037 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.487064 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.487082 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:34Z","lastTransitionTime":"2026-02-24T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.489596 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc76b60d9c2e639d3340a172e70670a434008774f3ec5751dae2ffc78a913486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:34Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.505395 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:34Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.527588 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:34Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.590600 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.590642 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.590651 4824 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.590670 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.590682 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:34Z","lastTransitionTime":"2026-02-24T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.692749 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.692863 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:07:34 crc kubenswrapper[4824]: E0224 00:07:34.692923 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 00:07:34 crc kubenswrapper[4824]: E0224 00:07:34.693115 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.693994 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.694038 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.694051 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.694071 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.694086 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:34Z","lastTransitionTime":"2026-02-24T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.718353 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 04:15:00.955625561 +0000 UTC Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.797066 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.797104 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.797113 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.797130 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.797140 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:34Z","lastTransitionTime":"2026-02-24T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.900048 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.900106 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.900119 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.900144 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:34 crc kubenswrapper[4824]: I0224 00:07:34.900159 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:34Z","lastTransitionTime":"2026-02-24T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.003420 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.003471 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.003483 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.003504 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.003557 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:35Z","lastTransitionTime":"2026-02-24T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.107339 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.107398 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.107415 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.107441 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.107462 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:35Z","lastTransitionTime":"2026-02-24T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.210172 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.210231 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.210243 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.210265 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.210278 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:35Z","lastTransitionTime":"2026-02-24T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.313030 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.313171 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.313191 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.313227 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.313247 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:35Z","lastTransitionTime":"2026-02-24T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.317127 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xjg6_d985b875-dd5e-4767-a4e2-209894575a8f/ovnkube-controller/1.log" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.317962 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xjg6_d985b875-dd5e-4767-a4e2-209894575a8f/ovnkube-controller/0.log" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.322081 4824 generic.go:334] "Generic (PLEG): container finished" podID="d985b875-dd5e-4767-a4e2-209894575a8f" containerID="9d676f945e46c0757bafb1cce26c0a049ef7be697902fb79cb2fc18454a13952" exitCode=1 Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.322133 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" event={"ID":"d985b875-dd5e-4767-a4e2-209894575a8f","Type":"ContainerDied","Data":"9d676f945e46c0757bafb1cce26c0a049ef7be697902fb79cb2fc18454a13952"} Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.322188 4824 scope.go:117] "RemoveContainer" containerID="9a2a925fe48fd97ac70a501bb5e22a03c571cd645c65d28ba6ce3318d93c26c5" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.323483 4824 scope.go:117] "RemoveContainer" containerID="9d676f945e46c0757bafb1cce26c0a049ef7be697902fb79cb2fc18454a13952" Feb 24 00:07:35 crc kubenswrapper[4824]: E0224 00:07:35.323820 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-4xjg6_openshift-ovn-kubernetes(d985b875-dd5e-4767-a4e2-209894575a8f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" podUID="d985b875-dd5e-4767-a4e2-209894575a8f" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.339327 4824 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:35Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.359195 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:35Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.386388 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d985b875-dd5e-4767-a4e2-209894575a8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d676f945e46c0757bafb1cce26c0a049ef7be697902fb79cb2fc18454a13952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a2a925fe48fd97ac70a501bb5e22a03c571cd645c65d28ba6ce3318d93c26c5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T00:07:33Z\\\",\\\"message\\\":\\\"7:33.236354 6496 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0224 00:07:33.236725 6496 reflector.go:311] Stopping reflector *v1.EgressService (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0224 00:07:33.236821 6496 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0224 00:07:33.237004 6496 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0224 00:07:33.237441 6496 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0224 00:07:33.237802 6496 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0224 00:07:33.237929 6496 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d676f945e46c0757bafb1cce26c0a049ef7be697902fb79cb2fc18454a13952\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T00:07:34Z\\\",\\\"message\\\":\\\"00:07:34.252477 6701 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0224 00:07:34.258107 6701 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0224 00:07:34.258158 6701 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0224 00:07:34.258177 6701 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0224 00:07:34.258183 6701 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0224 00:07:34.258224 6701 handler.go:190] Sending *v1.Pod event handler 3 for 
removal\\\\nI0224 00:07:34.258229 6701 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0224 00:07:34.258261 6701 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0224 00:07:34.258290 6701 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0224 00:07:34.258298 6701 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0224 00:07:34.258306 6701 handler.go:208] Removed *v1.Node event handler 2\\\\nI0224 00:07:34.258312 6701 handler.go:208] Removed *v1.Node event handler 7\\\\nI0224 00:07:34.258320 6701 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0224 00:07:34.260056 6701 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0224 00:07:34.260124 6701 factory.go:656] Stopping watch factory\\\\nI0224 00:07:34.260149 6701 ovnkube.go:599] Stopped ovnkube\\\\nI0224 00:07:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\
\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-
x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xjg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:35Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.404203 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5209eb40-b077-42f2-9239-27bf1cde0e05\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c8ceeca4d71401bcd6ae7d050ac58f840a0cc91f699f91f799930c896be922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"
/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd9b54ae9c9e895db2191be45a25d352d8c0617350519ad21f6ed3f2af941fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf3a76ba41dba79001ebad9083600800b0f094accb5dc60c2a364aa001909feb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db87a346b00854fb6769899a263eb5f7fad8cfa115365e2197c0e02f7285ff99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-o
perator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T00:06:36Z\\\",\\\"message\\\":\\\"W0224 00:06:35.979734 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0224 00:06:35.980117 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771891595 cert, and key in /tmp/serving-cert-1325703996/serving-signer.crt, /tmp/serving-cert-1325703996/serving-signer.key\\\\nI0224 00:06:36.313594 1 observer_polling.go:159] Starting file observer\\\\nW0224 00:06:36.323413 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0224 00:06:36.325040 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 00:06:36.326853 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1325703996/tls.crt::/tmp/serving-cert-1325703996/tls.key\\\\\\\"\\\\nF0224 00:06:36.787238 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:06:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81bb21556a01a403315d2c9623aa07ab39ac4a789433423a89eda1da3547b6d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:35Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.415756 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.415800 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.415810 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.415829 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.415840 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:35Z","lastTransitionTime":"2026-02-24T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.419479 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684e1bbbe61030fdd75db917dbf9ac4fedf392502a653c8f4600dc829410aa95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://983e13c7b9d6d0b8183250aedfe93ec3ac42de4dae305c03caa9f653082bb329\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:35Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.435551 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:35Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.452093 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvqfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cafa37efd3e722940c0c2daae6107dc324136c75f889084d39711c20d7e887d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvqfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:35Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.465347 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2zsq6" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bffd69c2-56a8-4fa0-9fbf-82a508f80ec1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4215a2974f4f4d1ab7b093c8d457f1f450276c89565a95ea13f1d42db9f07afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkj25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2zsq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:35Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.481505 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b147bde27310be40a9d49e2254ff858c03a32602bad1de273272b7b3292f057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
02-24T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:35Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.495277 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nwxht" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea7ca84e-cb8c-4bdf-ba10-bd2aca09bdaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69860166bff170380f7baa9e9b405900881403e17fbb017db27e9136e1c3696e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nwxht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:35Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.511593 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939ca085-9383-42e6-b7d6-37f101137273\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc64aac8c2354ba053a22c4c5cde7371a99c7aaaece1fe2dd2f7b41eceb4e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13c56d6a66d6912c9ae019eb515bfeb043d9ce26a200da5330581d07cd849ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vcbgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:35Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.518729 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.518790 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.518807 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.518834 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.518852 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:35Z","lastTransitionTime":"2026-02-24T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.527619 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d64vq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b16a40794ba6c7bfbe672f5438284c9479efc8d1fe6a87b8292b3595799772e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd5818c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cd5818c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d152897b22f88ea3328011ed0b921e0f33f5c0375f4af98b424da48c20a268\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43d152897b22f88ea3328011ed0b921e0f33f5c0375f4af98b424da48c20a268\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86a454258a9043824fa0ce318ad33be2fc4c029f75f0fc489b22716d69d67d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86a454258a9043824fa0ce318ad33be2fc4c029f75f0fc489b22716d69d67d45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d64vq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:35Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.546967 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc76b60d9c2e639d3340a172e70670a434008774f3ec5751dae2ffc78a913486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-
24T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:35Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.622511 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.622574 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.622584 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.622603 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.622614 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:35Z","lastTransitionTime":"2026-02-24T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.692934 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:07:35 crc kubenswrapper[4824]: E0224 00:07:35.693091 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.719641 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 02:51:28.213542899 +0000 UTC Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.726193 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.726235 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.726249 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.726270 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.726282 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:35Z","lastTransitionTime":"2026-02-24T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network 
not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.830624 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.830672 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.830684 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.830702 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.830712 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:35Z","lastTransitionTime":"2026-02-24T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.835046 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh"] Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.836003 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.839973 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.840401 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.856752 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc76b60d9c2e639d3340a172e70670a434008774f3ec5751dae2ffc78a913486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:16Z\\\"}},\\\"volumeMounts\\\"
:[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:35Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.857238 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0525cd89-44e0-47f1-856c-f566eb21596a-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-dbxhh\" (UID: \"0525cd89-44e0-47f1-856c-f566eb21596a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.857303 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0525cd89-44e0-47f1-856c-f566eb21596a-env-overrides\") pod \"ovnkube-control-plane-749d76644c-dbxhh\" (UID: \"0525cd89-44e0-47f1-856c-f566eb21596a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.857368 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/0525cd89-44e0-47f1-856c-f566eb21596a-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-dbxhh\" (UID: \"0525cd89-44e0-47f1-856c-f566eb21596a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.857435 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5smpb\" (UniqueName: \"kubernetes.io/projected/0525cd89-44e0-47f1-856c-f566eb21596a-kube-api-access-5smpb\") pod \"ovnkube-control-plane-749d76644c-dbxhh\" (UID: \"0525cd89-44e0-47f1-856c-f566eb21596a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.872160 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:35Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.885711 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:35Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.902083 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5209eb40-b077-42f2-9239-27bf1cde0e05\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c8ceeca4d71401bcd6ae7d050ac58f840a0cc91f699f91f799930c896be922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd9b54ae9c9e895db2191be45a25d352d8c0617350519ad21f6ed3f2af941fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf3a76ba41dba79001ebad9083600800b0f094accb5dc60c2a364aa001909feb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db87a346b00854fb6769899a263eb5f7fad8cfa115365e2197c0e02f7285ff99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T00:06:36Z\\\",\\\"message\\\":\\\"W0224 00:06:35.979734 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0224 00:06:35.980117 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771891595 cert, and key in /tmp/serving-cert-1325703996/serving-signer.crt, 
/tmp/serving-cert-1325703996/serving-signer.key\\\\nI0224 00:06:36.313594 1 observer_polling.go:159] Starting file observer\\\\nW0224 00:06:36.323413 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0224 00:06:36.325040 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 00:06:36.326853 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1325703996/tls.crt::/tmp/serving-cert-1325703996/tls.key\\\\\\\"\\\\nF0224 00:06:36.787238 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:06:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81bb21556a01a403315d2c9623aa07ab39ac4a789433423a89eda1da3547b6d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\
\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:35Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.919928 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684e1bbbe61030fdd75db917dbf9ac4fedf392502a653c8f4600dc829410aa95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://983e13c7b9d6d0b8183250aedfe93ec3ac42de4dae305c03caa9f653082bb329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:35Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.934483 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.934573 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.934591 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.934619 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.934640 4824 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:35Z","lastTransitionTime":"2026-02-24T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.936924 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:35Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.954449 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvqfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cafa37efd3e722940c0c2daae6107dc324136c75f889084d39711c20d7e887d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvqfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:35Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.958534 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/0525cd89-44e0-47f1-856c-f566eb21596a-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-dbxhh\" (UID: \"0525cd89-44e0-47f1-856c-f566eb21596a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.958588 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0525cd89-44e0-47f1-856c-f566eb21596a-env-overrides\") pod \"ovnkube-control-plane-749d76644c-dbxhh\" (UID: \"0525cd89-44e0-47f1-856c-f566eb21596a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.958612 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0525cd89-44e0-47f1-856c-f566eb21596a-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-dbxhh\" (UID: \"0525cd89-44e0-47f1-856c-f566eb21596a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.958638 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5smpb\" (UniqueName: \"kubernetes.io/projected/0525cd89-44e0-47f1-856c-f566eb21596a-kube-api-access-5smpb\") pod \"ovnkube-control-plane-749d76644c-dbxhh\" (UID: \"0525cd89-44e0-47f1-856c-f566eb21596a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.959413 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0525cd89-44e0-47f1-856c-f566eb21596a-env-overrides\") pod \"ovnkube-control-plane-749d76644c-dbxhh\" (UID: \"0525cd89-44e0-47f1-856c-f566eb21596a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh" Feb 24 00:07:35 crc kubenswrapper[4824]: 
I0224 00:07:35.959415 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0525cd89-44e0-47f1-856c-f566eb21596a-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-dbxhh\" (UID: \"0525cd89-44e0-47f1-856c-f566eb21596a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.965005 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0525cd89-44e0-47f1-856c-f566eb21596a-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-dbxhh\" (UID: \"0525cd89-44e0-47f1-856c-f566eb21596a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.975354 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d985b875-dd5e-4767-a4e2-209894575a8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d676f945e46c0757bafb1cce26c0a049ef7be697902fb79cb2fc18454a13952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a2a925fe48fd97ac70a501bb5e22a03c571cd645c65d28ba6ce3318d93c26c5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T00:07:33Z\\\",\\\"message\\\":\\\"7:33.236354 6496 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute 
(0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0224 00:07:33.236725 6496 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0224 00:07:33.236821 6496 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0224 00:07:33.237004 6496 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0224 00:07:33.237441 6496 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0224 00:07:33.237802 6496 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0224 00:07:33.237929 6496 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d676f945e46c0757bafb1cce26c0a049ef7be697902fb79cb2fc18454a13952\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T00:07:34Z\\\",\\\"message\\\":\\\"00:07:34.252477 6701 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0224 00:07:34.258107 6701 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0224 00:07:34.258158 6701 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0224 00:07:34.258177 6701 
handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0224 00:07:34.258183 6701 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0224 00:07:34.258224 6701 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0224 00:07:34.258229 6701 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0224 00:07:34.258261 6701 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0224 00:07:34.258290 6701 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0224 00:07:34.258298 6701 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0224 00:07:34.258306 6701 handler.go:208] Removed *v1.Node event handler 2\\\\nI0224 00:07:34.258312 6701 handler.go:208] Removed *v1.Node event handler 7\\\\nI0224 00:07:34.258320 6701 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0224 00:07:34.260056 6701 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0224 00:07:34.260124 6701 factory.go:656] Stopping watch factory\\\\nI0224 00:07:34.260149 6701 ovnkube.go:599] Stopped ovnkube\\\\nI0224 
00:07:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cd
a00d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xjg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:35Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.979361 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5smpb\" (UniqueName: \"kubernetes.io/projected/0525cd89-44e0-47f1-856c-f566eb21596a-kube-api-access-5smpb\") pod \"ovnkube-control-plane-749d76644c-dbxhh\" (UID: \"0525cd89-44e0-47f1-856c-f566eb21596a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh" Feb 24 00:07:35 crc kubenswrapper[4824]: I0224 00:07:35.990412 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b147bde27310be40a9d49e2254ff858c03a32602bad1de273272b7b3292f057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-24T00:07:35Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.001894 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nwxht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea7ca84e-cb8c-4bdf-ba10-bd2aca09bdaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69860166bff170380f7baa9e9b405900881403e17fbb017db27e9136e1c3696e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lxc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nwxht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:36Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.014499 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939ca085-9383-42e6-b7d6-37f101137273\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc64aac8c2354ba053a22c4c5cde7
371a99c7aaaece1fe2dd2f7b41eceb4e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13c56d6a66d6912c9ae019eb515bfeb043d9ce26a200da5330581d07cd849ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-02-24T00:07:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vcbgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:36Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.030871 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d64vq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b16a40794ba6c7bfbe672f5438284c9479efc8d1fe6a87b8292b3595799772e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:0
7:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd5818c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cd5818c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d152897b22f88ea3328011ed0b921e0f33f5c0375f4af98b424da48c20a268\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43d152897b22f88ea3328011ed0b921e0f33f5c0375f4af98b424da48c20a268\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86a454258a9043824fa0ce318ad33be2fc4c029f75f0fc489b22716d69d67d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86a454258a9043824fa0ce318ad33be2fc4c029f75f0fc489b22716d69d67d45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:31Z\\\"}},\\\"volume
Mounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d64vq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:36Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.038109 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.038147 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.038157 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.038174 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.038184 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:36Z","lastTransitionTime":"2026-02-24T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.043162 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2zsq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bffd69c2-56a8-4fa0-9fbf-82a508f80ec1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4215a2974f4f4d1ab7b093c8d457f1f450276c89565a95ea13f1d42db9f07afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":
\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkj25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2zsq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:36Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.053426 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0525cd89-44e0-47f1-856c-f566eb21596a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5smpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5smpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dbxhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:36Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.140483 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.140545 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.140557 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.140574 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.140584 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:36Z","lastTransitionTime":"2026-02-24T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.148820 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh" Feb 24 00:07:36 crc kubenswrapper[4824]: W0224 00:07:36.162598 4824 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0525cd89_44e0_47f1_856c_f566eb21596a.slice/crio-315b104f0a21ba0ac891b8dceb73312e4e9d79eab578a71f2940b8799eee58e3 WatchSource:0}: Error finding container 315b104f0a21ba0ac891b8dceb73312e4e9d79eab578a71f2940b8799eee58e3: Status 404 returned error can't find the container with id 315b104f0a21ba0ac891b8dceb73312e4e9d79eab578a71f2940b8799eee58e3 Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.243099 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.243143 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.243153 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.243170 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.243182 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:36Z","lastTransitionTime":"2026-02-24T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.329150 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xjg6_d985b875-dd5e-4767-a4e2-209894575a8f/ovnkube-controller/1.log" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.333680 4824 scope.go:117] "RemoveContainer" containerID="9d676f945e46c0757bafb1cce26c0a049ef7be697902fb79cb2fc18454a13952" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.333738 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh" event={"ID":"0525cd89-44e0-47f1-856c-f566eb21596a","Type":"ContainerStarted","Data":"315b104f0a21ba0ac891b8dceb73312e4e9d79eab578a71f2940b8799eee58e3"} Feb 24 00:07:36 crc kubenswrapper[4824]: E0224 00:07:36.333943 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-4xjg6_openshift-ovn-kubernetes(d985b875-dd5e-4767-a4e2-209894575a8f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" podUID="d985b875-dd5e-4767-a4e2-209894575a8f" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.345652 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.345693 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.345703 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.345719 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.345730 4824 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:36Z","lastTransitionTime":"2026-02-24T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.349442 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684e1bbbe61030fdd75db917dbf9ac4fedf392502a653c8f4600dc829410aa95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrid
es\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://983e13c7b9d6d0b8183250aedfe93ec3ac42de4dae305c03caa9f653082bb329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:36Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.362908 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:36Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.376146 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvqfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cafa37efd3e722940c0c2daae6107dc324136c75f889084d39711c20d7e887d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvqfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:36Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.394132 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d985b875-dd5e-4767-a4e2-209894575a8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d676f945e46c0757bafb1cce26c0a049ef7be697902fb79cb2fc18454a13952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d676f945e46c0757bafb1cce26c0a049ef7be697902fb79cb2fc18454a13952\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T00:07:34Z\\\",\\\"message\\\":\\\"00:07:34.252477 6701 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0224 00:07:34.258107 6701 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0224 00:07:34.258158 6701 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0224 00:07:34.258177 6701 handler.go:190] Sending *v1.Node event handler 
2 for removal\\\\nI0224 00:07:34.258183 6701 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0224 00:07:34.258224 6701 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0224 00:07:34.258229 6701 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0224 00:07:34.258261 6701 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0224 00:07:34.258290 6701 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0224 00:07:34.258298 6701 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0224 00:07:34.258306 6701 handler.go:208] Removed *v1.Node event handler 2\\\\nI0224 00:07:34.258312 6701 handler.go:208] Removed *v1.Node event handler 7\\\\nI0224 00:07:34.258320 6701 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0224 00:07:34.260056 6701 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0224 00:07:34.260124 6701 factory.go:656] Stopping watch factory\\\\nI0224 00:07:34.260149 6701 ovnkube.go:599] Stopped ovnkube\\\\nI0224 00:07:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4xjg6_openshift-ovn-kubernetes(d985b875-dd5e-4767-a4e2-209894575a8f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be
3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xjg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:36Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.408468 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5209eb40-b077-42f2-9239-27bf1cde0e05\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c8ceeca4d71401bcd6ae7d050ac58f840a0cc91f699f91f799930c896be922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd9b54ae9c9e895db2191be45a25d352d8c0617350519ad21f6ed3f2af941fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf3a76ba41dba79001ebad9083600800b0f094accb5dc60c2a364aa001909feb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db87a346b00854fb6769899a263eb5f7fad8cfa115365e2197c0e02f7285ff99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T00:06:36Z\\\",\\\"message\\\":\\\"W0224 00:06:35.979734 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0224 00:06:35.980117 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771891595 cert, and key in /tmp/serving-cert-1325703996/serving-signer.crt, 
/tmp/serving-cert-1325703996/serving-signer.key\\\\nI0224 00:06:36.313594 1 observer_polling.go:159] Starting file observer\\\\nW0224 00:06:36.323413 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0224 00:06:36.325040 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 00:06:36.326853 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1325703996/tls.crt::/tmp/serving-cert-1325703996/tls.key\\\\\\\"\\\\nF0224 00:06:36.787238 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:06:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81bb21556a01a403315d2c9623aa07ab39ac4a789433423a89eda1da3547b6d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\
\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:36Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.418284 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nwxht" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea7ca84e-cb8c-4bdf-ba10-bd2aca09bdaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69860166bff170380f7baa9e9b405900881403e17fbb017db27e9136e1c3696e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nwxht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:36Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.428740 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939ca085-9383-42e6-b7d6-37f101137273\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc64aac8c2354ba053a22c4c5cde7371a99c7aaaece1fe2dd2f7b41eceb4e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13c56d6a66d6912c9ae019eb515bfeb043d9ce26a200da5330581d07cd849ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vcbgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:36Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.442282 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d64vq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b16a40794ba6c7bfbe672f5438284c9479efc8d1fe6a87b8292b3595799772e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd5818c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cd5818c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d152897b22f88ea3328011ed0b921e0f33f5c0375f4af98b424da48c20a268\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43d152897b22f88ea3328011ed0b921e0f33f5c0375f4af98b424da48c20a268\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86a454258a9043824fa0ce318ad33be2fc4c029f75f0fc489b22716d69d67d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86a454258a9043824fa0ce318ad33be2fc4c029f75f0fc489b22716d69d67d45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d64vq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:36Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.448758 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.448807 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.448822 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.448842 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.448855 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:36Z","lastTransitionTime":"2026-02-24T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.451893 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2zsq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bffd69c2-56a8-4fa0-9fbf-82a508f80ec1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4215a2974f4f4d1ab7b093c8d457f1f450276c89565a95ea13f1d42db9f07afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkj25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2zsq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:36Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.462894 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0525cd89-44e0-47f1-856c-f566eb21596a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5smpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5smpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dbxhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:36Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.474132 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b147bde27310be40a9d49e2254ff858c03a32602bad1de273272b7b3292f057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-02-24T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:36Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.486284 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc76b60d9c2e639d3340a172e70670a434008774f3ec5751dae2ffc78a913486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:36Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.505806 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:36Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.520595 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:36Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.551816 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.551860 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:36 crc 
kubenswrapper[4824]: I0224 00:07:36.551868 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.551884 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.551894 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:36Z","lastTransitionTime":"2026-02-24T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.581248 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-98z42"] Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.581860 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z42" Feb 24 00:07:36 crc kubenswrapper[4824]: E0224 00:07:36.581925 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-98z42" podUID="a648113f-3e46-4170-ba30-7155fefbb413" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.596508 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5209eb40-b077-42f2-9239-27bf1cde0e05\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c8ceeca4d71401bcd6ae7d050ac58f840a0cc91f699f91f799930c896be922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd9b54ae9c9e895db2191be45a25d352d8c0617350519ad21f6ed3f2af941fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://bf3a76ba41dba79001ebad9083600800b0f094accb5dc60c2a364aa001909feb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db87a346b00854fb6769899a263eb5f7fad8cfa115365e2197c0e02f7285ff99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T00:06:36Z\\\",\\\"message\\\":\\\"W0224 00:06:35.979734 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0224 00:06:35.980117 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771891595 cert, and key in /tmp/serving-cert-1325703996/serving-signer.crt, /tmp/serving-cert-1325703996/serving-signer.key\\\\nI0224 00:06:36.313594 1 observer_polling.go:159] Starting file observer\\\\nW0224 00:06:36.323413 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0224 00:06:36.325040 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 00:06:36.326853 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1325703996/tls.crt::/tmp/serving-cert-1325703996/tls.key\\\\\\\"\\\\nF0224 00:06:36.787238 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:06:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81bb21556a01a403315d2c9623aa07ab39ac4a789433423a89eda1da3547b6d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:36Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.608736 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684e1bbbe61030fdd75db917dbf9ac4fedf392502a653c8f4600dc829410aa95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://983e13c7b9d6d0b8183250aedfe93ec3ac42de4dae305c03caa9f653082bb329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:36Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.619577 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:36Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.630763 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvqfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cafa37efd3e722940c0c2daae6107dc324136c75f889084d39711c20d7e887d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvqfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:36Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.650346 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d985b875-dd5e-4767-a4e2-209894575a8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d676f945e46c0757bafb1cce26c0a049ef7be697902fb79cb2fc18454a13952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d676f945e46c0757bafb1cce26c0a049ef7be697902fb79cb2fc18454a13952\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T00:07:34Z\\\",\\\"message\\\":\\\"00:07:34.252477 6701 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0224 00:07:34.258107 6701 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0224 00:07:34.258158 6701 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0224 00:07:34.258177 6701 handler.go:190] Sending *v1.Node event handler 
2 for removal\\\\nI0224 00:07:34.258183 6701 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0224 00:07:34.258224 6701 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0224 00:07:34.258229 6701 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0224 00:07:34.258261 6701 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0224 00:07:34.258290 6701 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0224 00:07:34.258298 6701 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0224 00:07:34.258306 6701 handler.go:208] Removed *v1.Node event handler 2\\\\nI0224 00:07:34.258312 6701 handler.go:208] Removed *v1.Node event handler 7\\\\nI0224 00:07:34.258320 6701 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0224 00:07:34.260056 6701 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0224 00:07:34.260124 6701 factory.go:656] Stopping watch factory\\\\nI0224 00:07:34.260149 6701 ovnkube.go:599] Stopped ovnkube\\\\nI0224 00:07:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4xjg6_openshift-ovn-kubernetes(d985b875-dd5e-4767-a4e2-209894575a8f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be
3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xjg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:36Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:36 crc kubenswrapper[4824]: E0224 00:07:36.652384 4824 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.662631 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0525cd89-44e0-47f1-856c-f566eb21596a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5smpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5smpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dbxhh\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:36Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.666819 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svh6v\" (UniqueName: \"kubernetes.io/projected/a648113f-3e46-4170-ba30-7155fefbb413-kube-api-access-svh6v\") pod \"network-metrics-daemon-98z42\" (UID: \"a648113f-3e46-4170-ba30-7155fefbb413\") " pod="openshift-multus/network-metrics-daemon-98z42" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.666927 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a648113f-3e46-4170-ba30-7155fefbb413-metrics-certs\") pod \"network-metrics-daemon-98z42\" (UID: \"a648113f-3e46-4170-ba30-7155fefbb413\") " pod="openshift-multus/network-metrics-daemon-98z42" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.673275 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b147bde27310be40a9d49e2254ff858c03a32602bad1de273272b7b3292f057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-24T00:07:36Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.686709 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nwxht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea7ca84e-cb8c-4bdf-ba10-bd2aca09bdaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69860166bff170380f7baa9e9b405900881403e17fbb017db27e9136e1c3696e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lxc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nwxht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:36Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.693109 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.693200 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:07:36 crc kubenswrapper[4824]: E0224 00:07:36.693245 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 00:07:36 crc kubenswrapper[4824]: E0224 00:07:36.693449 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.699692 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939ca085-9383-42e6-b7d6-37f101137273\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc64aac8c2354ba053a22c4c5cde7371a99c7aaaece1fe2dd2f7b41eceb4e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\
\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13c56d6a66d6912c9ae019eb515bfeb043d9ce26a200da5330581d07cd849ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vcbgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:36Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.716757 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d64vq" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b16a40794ba6c7bfbe672f5438284c9479efc8d1fe6a87b8292b3595799772e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829
570bf3dbef9c871b6772f3628f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"20
26-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://6cd5818c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cd5818c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d152897b22f88ea3328011ed0b921e0f33f5c0375f4af98b424da48c20a268\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43d152897b22f88ea3328011ed0b921e0f33f5c0375f4af98b424da48c20a268\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:3
0Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86a454258a9043824fa0ce318ad33be2fc4c029f75f0fc489b22716d69d67d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86a454258a9043824fa0ce318ad33be2fc4c029f75f0fc489b22716d69d67d45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d64vq\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:36Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.719886 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 05:55:19.707900619 +0000 UTC Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.728133 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2zsq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bffd69c2-56a8-4fa0-9fbf-82a508f80ec1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4215a2974f4f4d1ab7b093c8d457f1f450276c89565a95ea13f1d42db9f07afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkj25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2zsq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:36Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.740374 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc76b60d9c2e639d3340a172e70670a434008774f3ec5751dae2ffc78a913486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:36Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.755755 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:36Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.767405 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a648113f-3e46-4170-ba30-7155fefbb413-metrics-certs\") pod \"network-metrics-daemon-98z42\" (UID: \"a648113f-3e46-4170-ba30-7155fefbb413\") " pod="openshift-multus/network-metrics-daemon-98z42" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.767454 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svh6v\" (UniqueName: \"kubernetes.io/projected/a648113f-3e46-4170-ba30-7155fefbb413-kube-api-access-svh6v\") pod \"network-metrics-daemon-98z42\" (UID: \"a648113f-3e46-4170-ba30-7155fefbb413\") " pod="openshift-multus/network-metrics-daemon-98z42" Feb 24 00:07:36 crc kubenswrapper[4824]: E0224 00:07:36.767895 4824 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 24 00:07:36 crc kubenswrapper[4824]: E0224 
00:07:36.767950 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a648113f-3e46-4170-ba30-7155fefbb413-metrics-certs podName:a648113f-3e46-4170-ba30-7155fefbb413 nodeName:}" failed. No retries permitted until 2026-02-24 00:07:37.267931282 +0000 UTC m=+121.257555761 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a648113f-3e46-4170-ba30-7155fefbb413-metrics-certs") pod "network-metrics-daemon-98z42" (UID: "a648113f-3e46-4170-ba30-7155fefbb413") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.771206 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:36Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:36 crc kubenswrapper[4824]: E0224 00:07:36.771629 4824 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.782365 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-98z42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a648113f-3e46-4170-ba30-7155fefbb413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svh6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svh6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-98z42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:36Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:36 crc 
kubenswrapper[4824]: I0224 00:07:36.794652 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b147bde27310be40a9d49e2254ff858c03a32602bad1de273272b7b3292f057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:36Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.801236 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svh6v\" (UniqueName: \"kubernetes.io/projected/a648113f-3e46-4170-ba30-7155fefbb413-kube-api-access-svh6v\") pod \"network-metrics-daemon-98z42\" (UID: \"a648113f-3e46-4170-ba30-7155fefbb413\") " pod="openshift-multus/network-metrics-daemon-98z42" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.804780 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nwxht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea7ca84e-cb8c-4bdf-ba10-bd2aca09bdaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69860166bff170380f7baa9e9b405900881403e17fbb017db27e9136e1c3696e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nwxht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:36Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.814926 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"939ca085-9383-42e6-b7d6-37f101137273\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc64aac8c2354ba053a22c4c5cde7371a99c7aaaece1fe2dd2f7b41eceb4e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13c56d6a66d6912c9ae019eb515bfeb043d9ce26
a200da5330581d07cd849ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vcbgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:36Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.830096 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d64vq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b16a40794ba6c7bfbe672f5438284c9479efc8d1fe6a87b8292b3595799772e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd58
18c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cd5818c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d152897b22f88ea3328011ed0b921e0f33f5c0375f4af98b424da48c20a268\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43d152897b22f88ea3328011ed0b921e0f33f5c0375f4af98b424da48c20a268\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:30Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86a454258a9043824fa0ce318ad33be2fc4c029f75f0fc489b22716d69d67d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86a454258a9043824fa0ce318ad33be2fc4c029f75f0fc489b22716d69d67d45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d64vq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:36Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.841087 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2zsq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bffd69c2-56a8-4fa0-9fbf-82a508f80ec1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4215a2974f4f4d1ab7b093c8d457f1f450276c89565a95ea13f1d42db9f07afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-02-24T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkj25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2zsq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:36Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.853379 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0525cd89-44e0-47f1-856c-f566eb21596a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5smpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5smpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dbxhh\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:36Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.868849 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc76b60d9c2e639d3340a172e70670a434008774f3ec5751dae2ffc78a913486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\
\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:36Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.881302 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:36Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.893282 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:36Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.911771 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-98z42" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a648113f-3e46-4170-ba30-7155fefbb413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svh6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svh6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-98z42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:36Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:36 crc 
kubenswrapper[4824]: I0224 00:07:36.930018 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5209eb40-b077-42f2-9239-27bf1cde0e05\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c8ceeca4d71401bcd6ae7d050ac58f840a0cc91f699f91f799930c896be922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd9b54ae9c9e895db2191be45a25d352d8c0617350519ad21f6ed3f2af941fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://bf3a76ba41dba79001ebad9083600800b0f094accb5dc60c2a364aa001909feb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db87a346b00854fb6769899a263eb5f7fad8cfa115365e2197c0e02f7285ff99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T00:06:36Z\\\",\\\"message\\\":\\\"W0224 00:06:35.979734 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0224 00:06:35.980117 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771891595 cert, and key in /tmp/serving-cert-1325703996/serving-signer.crt, /tmp/serving-cert-1325703996/serving-signer.key\\\\nI0224 00:06:36.313594 1 observer_polling.go:159] Starting file observer\\\\nW0224 00:06:36.323413 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0224 00:06:36.325040 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 00:06:36.326853 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1325703996/tls.crt::/tmp/serving-cert-1325703996/tls.key\\\\\\\"\\\\nF0224 00:06:36.787238 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:06:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81bb21556a01a403315d2c9623aa07ab39ac4a789433423a89eda1da3547b6d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:36Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.946474 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684e1bbbe61030fdd75db917dbf9ac4fedf392502a653c8f4600dc829410aa95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://983e13c7b9d6d0b8183250aedfe93ec3ac42de4dae305c03caa9f653082bb329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:36Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.959838 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:36Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.974310 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvqfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cafa37efd3e722940c0c2daae6107dc324136c75f889084d39711c20d7e887d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvqfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:36Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:36 crc kubenswrapper[4824]: I0224 00:07:36.993084 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d985b875-dd5e-4767-a4e2-209894575a8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d676f945e46c0757bafb1cce26c0a049ef7be697902fb79cb2fc18454a13952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d676f945e46c0757bafb1cce26c0a049ef7be697902fb79cb2fc18454a13952\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T00:07:34Z\\\",\\\"message\\\":\\\"00:07:34.252477 6701 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0224 00:07:34.258107 6701 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0224 00:07:34.258158 6701 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0224 00:07:34.258177 6701 handler.go:190] Sending *v1.Node event handler 
2 for removal\\\\nI0224 00:07:34.258183 6701 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0224 00:07:34.258224 6701 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0224 00:07:34.258229 6701 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0224 00:07:34.258261 6701 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0224 00:07:34.258290 6701 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0224 00:07:34.258298 6701 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0224 00:07:34.258306 6701 handler.go:208] Removed *v1.Node event handler 2\\\\nI0224 00:07:34.258312 6701 handler.go:208] Removed *v1.Node event handler 7\\\\nI0224 00:07:34.258320 6701 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0224 00:07:34.260056 6701 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0224 00:07:34.260124 6701 factory.go:656] Stopping watch factory\\\\nI0224 00:07:34.260149 6701 ovnkube.go:599] Stopped ovnkube\\\\nI0224 00:07:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4xjg6_openshift-ovn-kubernetes(d985b875-dd5e-4767-a4e2-209894575a8f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be
3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xjg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:36Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:37 crc kubenswrapper[4824]: I0224 00:07:37.272405 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a648113f-3e46-4170-ba30-7155fefbb413-metrics-certs\") pod \"network-metrics-daemon-98z42\" (UID: \"a648113f-3e46-4170-ba30-7155fefbb413\") " pod="openshift-multus/network-metrics-daemon-98z42" Feb 24 00:07:37 crc kubenswrapper[4824]: E0224 00:07:37.272600 4824 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 24 00:07:37 crc kubenswrapper[4824]: E0224 00:07:37.272670 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a648113f-3e46-4170-ba30-7155fefbb413-metrics-certs podName:a648113f-3e46-4170-ba30-7155fefbb413 nodeName:}" failed. 
No retries permitted until 2026-02-24 00:07:38.272652504 +0000 UTC m=+122.262276973 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a648113f-3e46-4170-ba30-7155fefbb413-metrics-certs") pod "network-metrics-daemon-98z42" (UID: "a648113f-3e46-4170-ba30-7155fefbb413") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 24 00:07:37 crc kubenswrapper[4824]: I0224 00:07:37.339605 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh" event={"ID":"0525cd89-44e0-47f1-856c-f566eb21596a","Type":"ContainerStarted","Data":"1f1d96e1625c71100e773b21982fde152e58d48d96841da879d6ab878342b1e7"} Feb 24 00:07:37 crc kubenswrapper[4824]: I0224 00:07:37.339663 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh" event={"ID":"0525cd89-44e0-47f1-856c-f566eb21596a","Type":"ContainerStarted","Data":"391a3f09eb431d4cc4af8eba60855591dd9fe90774ac0ecbf8ed112d41146fb6"} Feb 24 00:07:37 crc kubenswrapper[4824]: I0224 00:07:37.358558 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc76b60d9c2e639d3340a172e70670a434008774f3ec5751dae2ffc78a913486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:37Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:37 crc kubenswrapper[4824]: I0224 00:07:37.373353 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:37Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:37 crc kubenswrapper[4824]: I0224 00:07:37.390983 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:37Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:37 crc kubenswrapper[4824]: I0224 00:07:37.401546 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-98z42" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a648113f-3e46-4170-ba30-7155fefbb413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svh6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svh6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-98z42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:37Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:37 crc 
kubenswrapper[4824]: I0224 00:07:37.419430 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5209eb40-b077-42f2-9239-27bf1cde0e05\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c8ceeca4d71401bcd6ae7d050ac58f840a0cc91f699f91f799930c896be922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd9b54ae9c9e895db2191be45a25d352d8c0617350519ad21f6ed3f2af941fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://bf3a76ba41dba79001ebad9083600800b0f094accb5dc60c2a364aa001909feb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db87a346b00854fb6769899a263eb5f7fad8cfa115365e2197c0e02f7285ff99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T00:06:36Z\\\",\\\"message\\\":\\\"W0224 00:06:35.979734 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0224 00:06:35.980117 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771891595 cert, and key in /tmp/serving-cert-1325703996/serving-signer.crt, /tmp/serving-cert-1325703996/serving-signer.key\\\\nI0224 00:06:36.313594 1 observer_polling.go:159] Starting file observer\\\\nW0224 00:06:36.323413 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0224 00:06:36.325040 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 00:06:36.326853 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1325703996/tls.crt::/tmp/serving-cert-1325703996/tls.key\\\\\\\"\\\\nF0224 00:06:36.787238 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:06:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81bb21556a01a403315d2c9623aa07ab39ac4a789433423a89eda1da3547b6d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:37Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:37 crc kubenswrapper[4824]: I0224 00:07:37.432651 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684e1bbbe61030fdd75db917dbf9ac4fedf392502a653c8f4600dc829410aa95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://983e13c7b9d6d0b8183250aedfe93ec3ac42de4dae305c03caa9f653082bb329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:37Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:37 crc kubenswrapper[4824]: I0224 00:07:37.446149 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:37Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:37 crc kubenswrapper[4824]: I0224 00:07:37.458264 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvqfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cafa37efd3e722940c0c2daae6107dc324136c75f889084d39711c20d7e887d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvqfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:37Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:37 crc kubenswrapper[4824]: I0224 00:07:37.476094 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d985b875-dd5e-4767-a4e2-209894575a8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d676f945e46c0757bafb1cce26c0a049ef7be697902fb79cb2fc18454a13952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d676f945e46c0757bafb1cce26c0a049ef7be697902fb79cb2fc18454a13952\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T00:07:34Z\\\",\\\"message\\\":\\\"00:07:34.252477 6701 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0224 00:07:34.258107 6701 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0224 00:07:34.258158 6701 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0224 00:07:34.258177 6701 handler.go:190] Sending *v1.Node event handler 
2 for removal\\\\nI0224 00:07:34.258183 6701 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0224 00:07:34.258224 6701 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0224 00:07:34.258229 6701 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0224 00:07:34.258261 6701 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0224 00:07:34.258290 6701 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0224 00:07:34.258298 6701 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0224 00:07:34.258306 6701 handler.go:208] Removed *v1.Node event handler 2\\\\nI0224 00:07:34.258312 6701 handler.go:208] Removed *v1.Node event handler 7\\\\nI0224 00:07:34.258320 6701 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0224 00:07:34.260056 6701 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0224 00:07:34.260124 6701 factory.go:656] Stopping watch factory\\\\nI0224 00:07:34.260149 6701 ovnkube.go:599] Stopped ovnkube\\\\nI0224 00:07:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4xjg6_openshift-ovn-kubernetes(d985b875-dd5e-4767-a4e2-209894575a8f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be
3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xjg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:37Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:37 crc kubenswrapper[4824]: I0224 00:07:37.488368 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b147bde27310be40a9d49e2254ff858c03a32602bad1de273272b7b3292f057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-24T00:07:37Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:37 crc kubenswrapper[4824]: I0224 00:07:37.503612 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nwxht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea7ca84e-cb8c-4bdf-ba10-bd2aca09bdaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69860166bff170380f7baa9e9b405900881403e17fbb017db27e9136e1c3696e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lxc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nwxht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:37Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:37 crc kubenswrapper[4824]: I0224 00:07:37.517337 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939ca085-9383-42e6-b7d6-37f101137273\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc64aac8c2354ba053a22c4c5cde7
371a99c7aaaece1fe2dd2f7b41eceb4e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13c56d6a66d6912c9ae019eb515bfeb043d9ce26a200da5330581d07cd849ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-02-24T00:07:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vcbgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:37Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:37 crc kubenswrapper[4824]: I0224 00:07:37.534795 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d64vq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b16a40794ba6c7bfbe672f5438284c9479efc8d1fe6a87b8292b3595799772e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:0
7:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd5818c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cd5818c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d152897b22f88ea3328011ed0b921e0f33f5c0375f4af98b424da48c20a268\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43d152897b22f88ea3328011ed0b921e0f33f5c0375f4af98b424da48c20a268\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86a454258a9043824fa0ce318ad33be2fc4c029f75f0fc489b22716d69d67d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86a454258a9043824fa0ce318ad33be2fc4c029f75f0fc489b22716d69d67d45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:31Z\\\"}},\\\"volume
Mounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d64vq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:37Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:37 crc kubenswrapper[4824]: I0224 00:07:37.546933 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2zsq6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bffd69c2-56a8-4fa0-9fbf-82a508f80ec1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4215a2974f4f4d1ab7b093c8d457f1f450276c89565a95ea13f1d42db9f07afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkj25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2zsq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:37Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:37 crc kubenswrapper[4824]: I0224 00:07:37.559020 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0525cd89-44e0-47f1-856c-f566eb21596a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391a3f09eb431d4cc4af8eba60855591dd9fe90774ac0ecbf8ed112d41146fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5smpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f1d96e1625c71100e773b21982fde152e58d48d96841da879d6ab878342b1e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5smpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dbxhh\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:37Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:37 crc kubenswrapper[4824]: I0224 00:07:37.693122 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:07:37 crc kubenswrapper[4824]: E0224 00:07:37.693279 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 00:07:37 crc kubenswrapper[4824]: I0224 00:07:37.720590 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 22:07:55.576620853 +0000 UTC Feb 24 00:07:38 crc kubenswrapper[4824]: I0224 00:07:38.282217 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a648113f-3e46-4170-ba30-7155fefbb413-metrics-certs\") pod \"network-metrics-daemon-98z42\" (UID: \"a648113f-3e46-4170-ba30-7155fefbb413\") " pod="openshift-multus/network-metrics-daemon-98z42" Feb 24 00:07:38 crc kubenswrapper[4824]: E0224 00:07:38.282358 4824 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 24 00:07:38 crc kubenswrapper[4824]: E0224 00:07:38.282444 4824 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/a648113f-3e46-4170-ba30-7155fefbb413-metrics-certs podName:a648113f-3e46-4170-ba30-7155fefbb413 nodeName:}" failed. No retries permitted until 2026-02-24 00:07:40.282427054 +0000 UTC m=+124.272051513 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a648113f-3e46-4170-ba30-7155fefbb413-metrics-certs") pod "network-metrics-daemon-98z42" (UID: "a648113f-3e46-4170-ba30-7155fefbb413") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 24 00:07:38 crc kubenswrapper[4824]: I0224 00:07:38.692727 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:07:38 crc kubenswrapper[4824]: I0224 00:07:38.692818 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:07:38 crc kubenswrapper[4824]: E0224 00:07:38.692871 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 00:07:38 crc kubenswrapper[4824]: I0224 00:07:38.692839 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z42" Feb 24 00:07:38 crc kubenswrapper[4824]: E0224 00:07:38.693005 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 00:07:38 crc kubenswrapper[4824]: E0224 00:07:38.693182 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z42" podUID="a648113f-3e46-4170-ba30-7155fefbb413" Feb 24 00:07:38 crc kubenswrapper[4824]: I0224 00:07:38.720863 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 19:28:15.024243713 +0000 UTC Feb 24 00:07:39 crc kubenswrapper[4824]: I0224 00:07:39.692949 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:07:39 crc kubenswrapper[4824]: E0224 00:07:39.693311 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 00:07:39 crc kubenswrapper[4824]: I0224 00:07:39.720955 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 00:18:38.12807262 +0000 UTC Feb 24 00:07:40 crc kubenswrapper[4824]: I0224 00:07:40.302787 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a648113f-3e46-4170-ba30-7155fefbb413-metrics-certs\") pod \"network-metrics-daemon-98z42\" (UID: \"a648113f-3e46-4170-ba30-7155fefbb413\") " pod="openshift-multus/network-metrics-daemon-98z42" Feb 24 00:07:40 crc kubenswrapper[4824]: E0224 00:07:40.302989 4824 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 24 00:07:40 crc kubenswrapper[4824]: E0224 00:07:40.303104 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a648113f-3e46-4170-ba30-7155fefbb413-metrics-certs podName:a648113f-3e46-4170-ba30-7155fefbb413 nodeName:}" failed. No retries permitted until 2026-02-24 00:07:44.303076224 +0000 UTC m=+128.292700703 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a648113f-3e46-4170-ba30-7155fefbb413-metrics-certs") pod "network-metrics-daemon-98z42" (UID: "a648113f-3e46-4170-ba30-7155fefbb413") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 24 00:07:40 crc kubenswrapper[4824]: I0224 00:07:40.362110 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 00:07:40 crc kubenswrapper[4824]: I0224 00:07:40.378603 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b147bde27310be40a9d49e2254ff858c03a32602bad1de273272b7b3292f057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:40Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:40 crc kubenswrapper[4824]: I0224 00:07:40.393773 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nwxht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea7ca84e-cb8c-4bdf-ba10-bd2aca09bdaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69860166bff170380f7baa9e9b40590
0881403e17fbb017db27e9136e1c3696e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nwxht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:40Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:40 crc kubenswrapper[4824]: I0224 00:07:40.409841 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"939ca085-9383-42e6-b7d6-37f101137273\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc64aac8c2354ba053a22c4c5cde7371a99c7aaaece1fe2dd2f7b41eceb4e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13c56d6a66d6912c9ae019eb515bfeb043d9ce26
a200da5330581d07cd849ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vcbgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:40Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:40 crc kubenswrapper[4824]: I0224 00:07:40.425008 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d64vq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b16a40794ba6c7bfbe672f5438284c9479efc8d1fe6a87b8292b3595799772e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd58
18c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cd5818c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d152897b22f88ea3328011ed0b921e0f33f5c0375f4af98b424da48c20a268\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43d152897b22f88ea3328011ed0b921e0f33f5c0375f4af98b424da48c20a268\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:30Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86a454258a9043824fa0ce318ad33be2fc4c029f75f0fc489b22716d69d67d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86a454258a9043824fa0ce318ad33be2fc4c029f75f0fc489b22716d69d67d45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d64vq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:40Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:40 crc kubenswrapper[4824]: I0224 00:07:40.435831 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2zsq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bffd69c2-56a8-4fa0-9fbf-82a508f80ec1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4215a2974f4f4d1ab7b093c8d457f1f450276c89565a95ea13f1d42db9f07afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-02-24T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkj25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2zsq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:40Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:40 crc kubenswrapper[4824]: I0224 00:07:40.447727 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0525cd89-44e0-47f1-856c-f566eb21596a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391a3f09eb431d4cc4af8eba60855591dd9fe90774ac0ecbf8ed112d41146fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5smpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f1d96e1625c71100e773b21982fde152e58d
48d96841da879d6ab878342b1e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5smpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dbxhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:40Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:40 crc kubenswrapper[4824]: I0224 00:07:40.463536 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc76b60d9c2e639d3340a172e70670a434008774f3ec5751dae2ffc78a913486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:40Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:40 crc kubenswrapper[4824]: I0224 00:07:40.476640 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:40Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:40 crc kubenswrapper[4824]: I0224 00:07:40.491985 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:40Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:40 crc kubenswrapper[4824]: I0224 00:07:40.504102 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-98z42" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a648113f-3e46-4170-ba30-7155fefbb413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svh6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svh6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-98z42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:40Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:40 crc 
kubenswrapper[4824]: I0224 00:07:40.520198 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5209eb40-b077-42f2-9239-27bf1cde0e05\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c8ceeca4d71401bcd6ae7d050ac58f840a0cc91f699f91f799930c896be922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd9b54ae9c9e8
95db2191be45a25d352d8c0617350519ad21f6ed3f2af941fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf3a76ba41dba79001ebad9083600800b0f094accb5dc60c2a364aa001909feb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db87a346b00854fb6769899a263eb5f7fad8cfa115365e2197c0e02f7285ff99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T00:06:36Z\\\",\\\"message\\\":\\\"W0224 00:06:35.979734 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0224 00:06:35.980117 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771891595 cert, and key in /tmp/serving-cert-1325703996/serving-signer.crt, /tmp/serving-cert-1325703996/serving-signer.key\\\\nI0224 00:06:36.313594 1 observer_polling.go:159] Starting file observer\\\\nW0224 00:06:36.323413 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0224 00:06:36.325040 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 00:06:36.326853 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1325703996/tls.crt::/tmp/serving-cert-1325703996/tls.key\\\\\\\"\\\\nF0224 00:06:36.787238 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:06:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81bb21556a01a403315d2c9623aa07ab39ac4a789433423a89eda1da3547b6d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:40Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:40 crc kubenswrapper[4824]: I0224 00:07:40.533437 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684e1bbbe61030fdd75db917dbf9ac4fedf392502a653c8f4600dc829410aa95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://983e13c7b9d6d0b8183250aedfe93ec3ac42de4dae305c03caa9f653082bb329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:40Z is after 2025-08-24T17:21:41Z" 
Feb 24 00:07:40 crc kubenswrapper[4824]: I0224 00:07:40.550174 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:40Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:40 crc kubenswrapper[4824]: I0224 00:07:40.563354 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvqfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cafa37efd3e722940c0c2daae6107dc324136c75f889084d39711c20d7e887d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvqfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:40Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:40 crc kubenswrapper[4824]: I0224 00:07:40.584589 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d985b875-dd5e-4767-a4e2-209894575a8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d676f945e46c0757bafb1cce26c0a049ef7be697902fb79cb2fc18454a13952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d676f945e46c0757bafb1cce26c0a049ef7be697902fb79cb2fc18454a13952\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T00:07:34Z\\\",\\\"message\\\":\\\"00:07:34.252477 6701 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0224 00:07:34.258107 6701 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0224 00:07:34.258158 6701 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0224 00:07:34.258177 6701 handler.go:190] Sending *v1.Node event handler 
2 for removal\\\\nI0224 00:07:34.258183 6701 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0224 00:07:34.258224 6701 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0224 00:07:34.258229 6701 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0224 00:07:34.258261 6701 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0224 00:07:34.258290 6701 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0224 00:07:34.258298 6701 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0224 00:07:34.258306 6701 handler.go:208] Removed *v1.Node event handler 2\\\\nI0224 00:07:34.258312 6701 handler.go:208] Removed *v1.Node event handler 7\\\\nI0224 00:07:34.258320 6701 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0224 00:07:34.260056 6701 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0224 00:07:34.260124 6701 factory.go:656] Stopping watch factory\\\\nI0224 00:07:34.260149 6701 ovnkube.go:599] Stopped ovnkube\\\\nI0224 00:07:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4xjg6_openshift-ovn-kubernetes(d985b875-dd5e-4767-a4e2-209894575a8f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be
3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xjg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:40Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:40 crc kubenswrapper[4824]: I0224 00:07:40.693606 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z42" Feb 24 00:07:40 crc kubenswrapper[4824]: I0224 00:07:40.693762 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:07:40 crc kubenswrapper[4824]: E0224 00:07:40.693847 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z42" podUID="a648113f-3e46-4170-ba30-7155fefbb413" Feb 24 00:07:40 crc kubenswrapper[4824]: I0224 00:07:40.693880 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:07:40 crc kubenswrapper[4824]: E0224 00:07:40.694028 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 00:07:40 crc kubenswrapper[4824]: E0224 00:07:40.694148 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 00:07:40 crc kubenswrapper[4824]: I0224 00:07:40.722002 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 10:48:24.293246484 +0000 UTC Feb 24 00:07:41 crc kubenswrapper[4824]: I0224 00:07:41.475288 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:41 crc kubenswrapper[4824]: I0224 00:07:41.475355 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:41 crc kubenswrapper[4824]: I0224 00:07:41.475372 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:41 crc kubenswrapper[4824]: I0224 00:07:41.475403 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:41 crc 
kubenswrapper[4824]: I0224 00:07:41.475421 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:41Z","lastTransitionTime":"2026-02-24T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:41 crc kubenswrapper[4824]: E0224 00:07:41.492475 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7ea41d01-04ab-44da-af10-993e94777268\\\",\\\"systemUUID\\\":\\\"d5e3d68d-d538-4dbe-b3fe-7347ab36b29a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:41Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:41 crc kubenswrapper[4824]: I0224 00:07:41.497415 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:41 crc kubenswrapper[4824]: I0224 00:07:41.497453 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:41 crc kubenswrapper[4824]: I0224 00:07:41.497464 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:41 crc kubenswrapper[4824]: I0224 00:07:41.497483 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:41 crc kubenswrapper[4824]: I0224 00:07:41.497496 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:41Z","lastTransitionTime":"2026-02-24T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:41 crc kubenswrapper[4824]: E0224 00:07:41.511257 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7ea41d01-04ab-44da-af10-993e94777268\\\",\\\"systemUUID\\\":\\\"d5e3d68d-d538-4dbe-b3fe-7347ab36b29a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:41Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:41 crc kubenswrapper[4824]: I0224 00:07:41.515961 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:41 crc kubenswrapper[4824]: I0224 00:07:41.516012 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:41 crc kubenswrapper[4824]: I0224 00:07:41.516023 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:41 crc kubenswrapper[4824]: I0224 00:07:41.516045 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:41 crc kubenswrapper[4824]: I0224 00:07:41.516059 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:41Z","lastTransitionTime":"2026-02-24T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:41 crc kubenswrapper[4824]: E0224 00:07:41.535261 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7ea41d01-04ab-44da-af10-993e94777268\\\",\\\"systemUUID\\\":\\\"d5e3d68d-d538-4dbe-b3fe-7347ab36b29a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:41Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:41 crc kubenswrapper[4824]: I0224 00:07:41.539774 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:41 crc kubenswrapper[4824]: I0224 00:07:41.539826 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:41 crc kubenswrapper[4824]: I0224 00:07:41.539837 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:41 crc kubenswrapper[4824]: I0224 00:07:41.539857 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:41 crc kubenswrapper[4824]: I0224 00:07:41.539870 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:41Z","lastTransitionTime":"2026-02-24T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:41 crc kubenswrapper[4824]: E0224 00:07:41.556022 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7ea41d01-04ab-44da-af10-993e94777268\\\",\\\"systemUUID\\\":\\\"d5e3d68d-d538-4dbe-b3fe-7347ab36b29a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:41Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:41 crc kubenswrapper[4824]: I0224 00:07:41.560197 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:41 crc kubenswrapper[4824]: I0224 00:07:41.560233 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:41 crc kubenswrapper[4824]: I0224 00:07:41.560243 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:41 crc kubenswrapper[4824]: I0224 00:07:41.560263 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:41 crc kubenswrapper[4824]: I0224 00:07:41.560274 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:41Z","lastTransitionTime":"2026-02-24T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:41 crc kubenswrapper[4824]: E0224 00:07:41.579369 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7ea41d01-04ab-44da-af10-993e94777268\\\",\\\"systemUUID\\\":\\\"d5e3d68d-d538-4dbe-b3fe-7347ab36b29a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:41Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:41 crc kubenswrapper[4824]: E0224 00:07:41.579567 4824 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 24 00:07:41 crc kubenswrapper[4824]: I0224 00:07:41.693256 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:07:41 crc kubenswrapper[4824]: E0224 00:07:41.693453 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 00:07:41 crc kubenswrapper[4824]: I0224 00:07:41.723195 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 19:55:37.720976079 +0000 UTC Feb 24 00:07:41 crc kubenswrapper[4824]: E0224 00:07:41.773743 4824 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 24 00:07:42 crc kubenswrapper[4824]: I0224 00:07:42.693489 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:07:42 crc kubenswrapper[4824]: I0224 00:07:42.693610 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z42" Feb 24 00:07:42 crc kubenswrapper[4824]: I0224 00:07:42.693793 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:07:42 crc kubenswrapper[4824]: E0224 00:07:42.693776 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 00:07:42 crc kubenswrapper[4824]: E0224 00:07:42.693970 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z42" podUID="a648113f-3e46-4170-ba30-7155fefbb413" Feb 24 00:07:42 crc kubenswrapper[4824]: E0224 00:07:42.694150 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 00:07:42 crc kubenswrapper[4824]: I0224 00:07:42.723861 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 08:30:19.499639207 +0000 UTC Feb 24 00:07:43 crc kubenswrapper[4824]: I0224 00:07:43.692960 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:07:43 crc kubenswrapper[4824]: E0224 00:07:43.693137 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 00:07:43 crc kubenswrapper[4824]: I0224 00:07:43.724881 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 04:23:35.46281603 +0000 UTC Feb 24 00:07:44 crc kubenswrapper[4824]: I0224 00:07:44.349525 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a648113f-3e46-4170-ba30-7155fefbb413-metrics-certs\") pod \"network-metrics-daemon-98z42\" (UID: \"a648113f-3e46-4170-ba30-7155fefbb413\") " pod="openshift-multus/network-metrics-daemon-98z42" Feb 24 00:07:44 crc kubenswrapper[4824]: E0224 00:07:44.349707 4824 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 24 00:07:44 crc kubenswrapper[4824]: E0224 00:07:44.349809 4824 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a648113f-3e46-4170-ba30-7155fefbb413-metrics-certs podName:a648113f-3e46-4170-ba30-7155fefbb413 nodeName:}" failed. No retries permitted until 2026-02-24 00:07:52.349782714 +0000 UTC m=+136.339407183 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a648113f-3e46-4170-ba30-7155fefbb413-metrics-certs") pod "network-metrics-daemon-98z42" (UID: "a648113f-3e46-4170-ba30-7155fefbb413") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 24 00:07:44 crc kubenswrapper[4824]: I0224 00:07:44.695684 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:07:44 crc kubenswrapper[4824]: E0224 00:07:44.695827 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 00:07:44 crc kubenswrapper[4824]: I0224 00:07:44.696028 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z42" Feb 24 00:07:44 crc kubenswrapper[4824]: E0224 00:07:44.696093 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-98z42" podUID="a648113f-3e46-4170-ba30-7155fefbb413" Feb 24 00:07:44 crc kubenswrapper[4824]: I0224 00:07:44.696372 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:07:44 crc kubenswrapper[4824]: E0224 00:07:44.696436 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 00:07:44 crc kubenswrapper[4824]: I0224 00:07:44.725239 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 14:07:51.424252425 +0000 UTC Feb 24 00:07:45 crc kubenswrapper[4824]: I0224 00:07:45.693092 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:07:45 crc kubenswrapper[4824]: E0224 00:07:45.693268 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 00:07:45 crc kubenswrapper[4824]: I0224 00:07:45.726164 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 17:24:09.116770044 +0000 UTC Feb 24 00:07:46 crc kubenswrapper[4824]: I0224 00:07:46.693439 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z42" Feb 24 00:07:46 crc kubenswrapper[4824]: I0224 00:07:46.693550 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:07:46 crc kubenswrapper[4824]: E0224 00:07:46.693642 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z42" podUID="a648113f-3e46-4170-ba30-7155fefbb413" Feb 24 00:07:46 crc kubenswrapper[4824]: I0224 00:07:46.693455 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:07:46 crc kubenswrapper[4824]: E0224 00:07:46.693725 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 00:07:46 crc kubenswrapper[4824]: E0224 00:07:46.693819 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 00:07:46 crc kubenswrapper[4824]: I0224 00:07:46.713790 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc76b60d9c2e639d3340a172e70670a434008774f3ec5751dae2ffc78a913486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:46Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:46 crc kubenswrapper[4824]: I0224 00:07:46.727033 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 22:35:47.501804223 +0000 UTC Feb 24 00:07:46 crc kubenswrapper[4824]: I0224 00:07:46.743697 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:46Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:46 crc kubenswrapper[4824]: I0224 00:07:46.758803 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:46Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:46 crc kubenswrapper[4824]: I0224 00:07:46.770461 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-98z42" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a648113f-3e46-4170-ba30-7155fefbb413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svh6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svh6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-98z42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:46Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:46 crc 
kubenswrapper[4824]: E0224 00:07:46.775302 4824 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 24 00:07:46 crc kubenswrapper[4824]: I0224 00:07:46.787715 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5209eb40-b077-42f2-9239-27bf1cde0e05\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c8ceeca4d71401bcd6ae7d050ac58f840a0cc91f699f91f799930c896be922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd9b54ae9c9e895db2191be45a25d352d8c0617350519ad21f6ed3f2af941fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf3a76ba41dba79001ebad9083600800b0f094accb5dc60c2a364aa001909feb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db87a346b00854fb6769899a263eb5f7fad8cfa115365e2197c0e02f7285ff99\\\",\\\"image\\\":\\\"qua
y.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T00:06:36Z\\\",\\\"message\\\":\\\"W0224 00:06:35.979734 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0224 00:06:35.980117 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771891595 cert, and key in /tmp/serving-cert-1325703996/serving-signer.crt, /tmp/serving-cert-1325703996/serving-signer.key\\\\nI0224 00:06:36.313594 1 observer_polling.go:159] Starting file observer\\\\nW0224 00:06:36.323413 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0224 00:06:36.325040 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 00:06:36.326853 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1325703996/tls.crt::/tmp/serving-cert-1325703996/tls.key\\\\\\\"\\\\nF0224 00:06:36.787238 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:06:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81bb21556a01a403315d2c9623aa07ab39ac4a789433423a89eda1da3547b6d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:46Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:46 crc kubenswrapper[4824]: I0224 00:07:46.806472 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684e1bbbe61030fdd75db917dbf9ac4fedf392502a653c8f4600dc829410aa95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://983e13c7b9d6d0b8183250aedfe93ec3ac42de4dae305c03caa9f653082bb329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:46Z is after 2025-08-24T17:21:41Z" 
Feb 24 00:07:46 crc kubenswrapper[4824]: I0224 00:07:46.819620 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:46Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:46 crc kubenswrapper[4824]: I0224 00:07:46.831701 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvqfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cafa37efd3e722940c0c2daae6107dc324136c75f889084d39711c20d7e887d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvqfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:46Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:46 crc kubenswrapper[4824]: I0224 00:07:46.851591 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d985b875-dd5e-4767-a4e2-209894575a8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d676f945e46c0757bafb1cce26c0a049ef7be697902fb79cb2fc18454a13952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d676f945e46c0757bafb1cce26c0a049ef7be697902fb79cb2fc18454a13952\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T00:07:34Z\\\",\\\"message\\\":\\\"00:07:34.252477 6701 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0224 00:07:34.258107 6701 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0224 00:07:34.258158 6701 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0224 00:07:34.258177 6701 handler.go:190] Sending *v1.Node event handler 
2 for removal\\\\nI0224 00:07:34.258183 6701 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0224 00:07:34.258224 6701 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0224 00:07:34.258229 6701 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0224 00:07:34.258261 6701 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0224 00:07:34.258290 6701 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0224 00:07:34.258298 6701 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0224 00:07:34.258306 6701 handler.go:208] Removed *v1.Node event handler 2\\\\nI0224 00:07:34.258312 6701 handler.go:208] Removed *v1.Node event handler 7\\\\nI0224 00:07:34.258320 6701 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0224 00:07:34.260056 6701 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0224 00:07:34.260124 6701 factory.go:656] Stopping watch factory\\\\nI0224 00:07:34.260149 6701 ovnkube.go:599] Stopped ovnkube\\\\nI0224 00:07:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4xjg6_openshift-ovn-kubernetes(d985b875-dd5e-4767-a4e2-209894575a8f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be
3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xjg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:46Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:46 crc kubenswrapper[4824]: I0224 00:07:46.863725 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b147bde27310be40a9d49e2254ff858c03a32602bad1de273272b7b3292f057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-24T00:07:46Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:46 crc kubenswrapper[4824]: I0224 00:07:46.873237 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nwxht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea7ca84e-cb8c-4bdf-ba10-bd2aca09bdaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69860166bff170380f7baa9e9b405900881403e17fbb017db27e9136e1c3696e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lxc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nwxht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:46Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:46 crc kubenswrapper[4824]: I0224 00:07:46.884043 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939ca085-9383-42e6-b7d6-37f101137273\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc64aac8c2354ba053a22c4c5cde7
371a99c7aaaece1fe2dd2f7b41eceb4e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13c56d6a66d6912c9ae019eb515bfeb043d9ce26a200da5330581d07cd849ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-02-24T00:07:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vcbgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:46Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:46 crc kubenswrapper[4824]: I0224 00:07:46.901185 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d64vq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b16a40794ba6c7bfbe672f5438284c9479efc8d1fe6a87b8292b3595799772e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:0
7:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd5818c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cd5818c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d152897b22f88ea3328011ed0b921e0f33f5c0375f4af98b424da48c20a268\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43d152897b22f88ea3328011ed0b921e0f33f5c0375f4af98b424da48c20a268\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86a454258a9043824fa0ce318ad33be2fc4c029f75f0fc489b22716d69d67d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86a454258a9043824fa0ce318ad33be2fc4c029f75f0fc489b22716d69d67d45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:31Z\\\"}},\\\"volume
Mounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d64vq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:46Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:46 crc kubenswrapper[4824]: I0224 00:07:46.912580 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2zsq6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bffd69c2-56a8-4fa0-9fbf-82a508f80ec1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4215a2974f4f4d1ab7b093c8d457f1f450276c89565a95ea13f1d42db9f07afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkj25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2zsq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:46Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:46 crc kubenswrapper[4824]: I0224 00:07:46.922327 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0525cd89-44e0-47f1-856c-f566eb21596a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391a3f09eb431d4cc4af8eba60855591dd9fe90774ac0ecbf8ed112d41146fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5smpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f1d96e1625c71100e773b21982fde152e58d48d96841da879d6ab878342b1e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5smpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dbxhh\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:46Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:47 crc kubenswrapper[4824]: I0224 00:07:47.693401 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:07:47 crc kubenswrapper[4824]: E0224 00:07:47.693640 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 00:07:47 crc kubenswrapper[4824]: I0224 00:07:47.727873 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 00:44:10.66885325 +0000 UTC Feb 24 00:07:48 crc kubenswrapper[4824]: I0224 00:07:48.692947 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:07:48 crc kubenswrapper[4824]: E0224 00:07:48.693130 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 00:07:48 crc kubenswrapper[4824]: I0224 00:07:48.693384 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z42" Feb 24 00:07:48 crc kubenswrapper[4824]: E0224 00:07:48.693463 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z42" podUID="a648113f-3e46-4170-ba30-7155fefbb413" Feb 24 00:07:48 crc kubenswrapper[4824]: I0224 00:07:48.693628 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:07:48 crc kubenswrapper[4824]: E0224 00:07:48.693696 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 00:07:48 crc kubenswrapper[4824]: I0224 00:07:48.728394 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 07:01:21.213513807 +0000 UTC Feb 24 00:07:49 crc kubenswrapper[4824]: I0224 00:07:49.692973 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:07:49 crc kubenswrapper[4824]: E0224 00:07:49.693472 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 00:07:49 crc kubenswrapper[4824]: I0224 00:07:49.729837 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 20:58:18.636463878 +0000 UTC Feb 24 00:07:50 crc kubenswrapper[4824]: I0224 00:07:50.693724 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z42" Feb 24 00:07:50 crc kubenswrapper[4824]: I0224 00:07:50.693804 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:07:50 crc kubenswrapper[4824]: I0224 00:07:50.693887 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:07:50 crc kubenswrapper[4824]: E0224 00:07:50.693981 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-98z42" podUID="a648113f-3e46-4170-ba30-7155fefbb413" Feb 24 00:07:50 crc kubenswrapper[4824]: E0224 00:07:50.694565 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 00:07:50 crc kubenswrapper[4824]: E0224 00:07:50.694882 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 00:07:50 crc kubenswrapper[4824]: I0224 00:07:50.730024 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 06:42:41.104037648 +0000 UTC Feb 24 00:07:51 crc kubenswrapper[4824]: I0224 00:07:51.693425 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:07:51 crc kubenswrapper[4824]: E0224 00:07:51.694598 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 00:07:51 crc kubenswrapper[4824]: I0224 00:07:51.695086 4824 scope.go:117] "RemoveContainer" containerID="9d676f945e46c0757bafb1cce26c0a049ef7be697902fb79cb2fc18454a13952" Feb 24 00:07:51 crc kubenswrapper[4824]: I0224 00:07:51.730194 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 07:11:05.373353708 +0000 UTC Feb 24 00:07:51 crc kubenswrapper[4824]: E0224 00:07:51.777425 4824 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 24 00:07:51 crc kubenswrapper[4824]: I0224 00:07:51.849737 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:51 crc kubenswrapper[4824]: I0224 00:07:51.850165 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:51 crc kubenswrapper[4824]: I0224 00:07:51.850440 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:51 crc kubenswrapper[4824]: I0224 00:07:51.850696 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:51 crc kubenswrapper[4824]: I0224 00:07:51.850923 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:51Z","lastTransitionTime":"2026-02-24T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:07:51 crc kubenswrapper[4824]: E0224 00:07:51.869289 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7ea41d01-04ab-44da-af10-993e94777268\\\",\\\"systemUUID\\\":\\\"d5e3d68d-d538-4dbe-b3fe-7347ab36b29a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:51 crc kubenswrapper[4824]: I0224 00:07:51.874981 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:51 crc kubenswrapper[4824]: I0224 00:07:51.875028 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:51 crc kubenswrapper[4824]: I0224 00:07:51.875040 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:51 crc kubenswrapper[4824]: I0224 00:07:51.875060 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:51 crc kubenswrapper[4824]: I0224 00:07:51.875075 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:51Z","lastTransitionTime":"2026-02-24T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:51 crc kubenswrapper[4824]: E0224 00:07:51.892775 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7ea41d01-04ab-44da-af10-993e94777268\\\",\\\"systemUUID\\\":\\\"d5e3d68d-d538-4dbe-b3fe-7347ab36b29a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:51 crc kubenswrapper[4824]: I0224 00:07:51.897950 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:51 crc kubenswrapper[4824]: I0224 00:07:51.898201 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:51 crc kubenswrapper[4824]: I0224 00:07:51.898298 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:51 crc kubenswrapper[4824]: I0224 00:07:51.898444 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:51 crc kubenswrapper[4824]: I0224 00:07:51.898589 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:51Z","lastTransitionTime":"2026-02-24T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:51 crc kubenswrapper[4824]: E0224 00:07:51.922046 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7ea41d01-04ab-44da-af10-993e94777268\\\",\\\"systemUUID\\\":\\\"d5e3d68d-d538-4dbe-b3fe-7347ab36b29a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:51 crc kubenswrapper[4824]: I0224 00:07:51.927997 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:51 crc kubenswrapper[4824]: I0224 00:07:51.928056 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:51 crc kubenswrapper[4824]: I0224 00:07:51.928079 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:51 crc kubenswrapper[4824]: I0224 00:07:51.928104 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:51 crc kubenswrapper[4824]: I0224 00:07:51.928122 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:51Z","lastTransitionTime":"2026-02-24T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:51 crc kubenswrapper[4824]: E0224 00:07:51.945388 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7ea41d01-04ab-44da-af10-993e94777268\\\",\\\"systemUUID\\\":\\\"d5e3d68d-d538-4dbe-b3fe-7347ab36b29a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:51 crc kubenswrapper[4824]: I0224 00:07:51.951202 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:07:51 crc kubenswrapper[4824]: I0224 00:07:51.951255 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:07:51 crc kubenswrapper[4824]: I0224 00:07:51.951273 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:07:51 crc kubenswrapper[4824]: I0224 00:07:51.951297 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:07:51 crc kubenswrapper[4824]: I0224 00:07:51.951312 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:07:51Z","lastTransitionTime":"2026-02-24T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:07:51 crc kubenswrapper[4824]: E0224 00:07:51.965340 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7ea41d01-04ab-44da-af10-993e94777268\\\",\\\"systemUUID\\\":\\\"d5e3d68d-d538-4dbe-b3fe-7347ab36b29a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:51 crc kubenswrapper[4824]: E0224 00:07:51.965811 4824 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 24 00:07:52 crc kubenswrapper[4824]: I0224 00:07:52.395480 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xjg6_d985b875-dd5e-4767-a4e2-209894575a8f/ovnkube-controller/1.log" Feb 24 00:07:52 crc kubenswrapper[4824]: I0224 00:07:52.398794 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" event={"ID":"d985b875-dd5e-4767-a4e2-209894575a8f","Type":"ContainerStarted","Data":"76341c8e4fd3e6aaeb74b20155da61acd373c362ebccb81a83847b8aa813c8ba"} Feb 24 00:07:52 crc kubenswrapper[4824]: I0224 00:07:52.439516 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a648113f-3e46-4170-ba30-7155fefbb413-metrics-certs\") pod \"network-metrics-daemon-98z42\" (UID: \"a648113f-3e46-4170-ba30-7155fefbb413\") " pod="openshift-multus/network-metrics-daemon-98z42" Feb 24 00:07:52 crc kubenswrapper[4824]: E0224 00:07:52.439735 4824 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 24 00:07:52 crc kubenswrapper[4824]: E0224 00:07:52.439866 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a648113f-3e46-4170-ba30-7155fefbb413-metrics-certs podName:a648113f-3e46-4170-ba30-7155fefbb413 nodeName:}" failed. 
No retries permitted until 2026-02-24 00:08:08.439833321 +0000 UTC m=+152.429457800 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a648113f-3e46-4170-ba30-7155fefbb413-metrics-certs") pod "network-metrics-daemon-98z42" (UID: "a648113f-3e46-4170-ba30-7155fefbb413") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 24 00:07:52 crc kubenswrapper[4824]: I0224 00:07:52.693172 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:07:52 crc kubenswrapper[4824]: I0224 00:07:52.693237 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z42" Feb 24 00:07:52 crc kubenswrapper[4824]: E0224 00:07:52.693323 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 00:07:52 crc kubenswrapper[4824]: I0224 00:07:52.693559 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:07:52 crc kubenswrapper[4824]: E0224 00:07:52.693557 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-98z42" podUID="a648113f-3e46-4170-ba30-7155fefbb413" Feb 24 00:07:52 crc kubenswrapper[4824]: E0224 00:07:52.693605 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 00:07:52 crc kubenswrapper[4824]: I0224 00:07:52.705244 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 24 00:07:52 crc kubenswrapper[4824]: I0224 00:07:52.731388 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 16:27:01.63347539 +0000 UTC Feb 24 00:07:53 crc kubenswrapper[4824]: I0224 00:07:53.403920 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xjg6_d985b875-dd5e-4767-a4e2-209894575a8f/ovnkube-controller/2.log" Feb 24 00:07:53 crc kubenswrapper[4824]: I0224 00:07:53.404426 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xjg6_d985b875-dd5e-4767-a4e2-209894575a8f/ovnkube-controller/1.log" Feb 24 00:07:53 crc kubenswrapper[4824]: I0224 00:07:53.406913 4824 generic.go:334] "Generic (PLEG): container finished" podID="d985b875-dd5e-4767-a4e2-209894575a8f" containerID="76341c8e4fd3e6aaeb74b20155da61acd373c362ebccb81a83847b8aa813c8ba" exitCode=1 Feb 24 00:07:53 crc kubenswrapper[4824]: I0224 00:07:53.407017 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" 
event={"ID":"d985b875-dd5e-4767-a4e2-209894575a8f","Type":"ContainerDied","Data":"76341c8e4fd3e6aaeb74b20155da61acd373c362ebccb81a83847b8aa813c8ba"} Feb 24 00:07:53 crc kubenswrapper[4824]: I0224 00:07:53.407114 4824 scope.go:117] "RemoveContainer" containerID="9d676f945e46c0757bafb1cce26c0a049ef7be697902fb79cb2fc18454a13952" Feb 24 00:07:53 crc kubenswrapper[4824]: I0224 00:07:53.407622 4824 scope.go:117] "RemoveContainer" containerID="76341c8e4fd3e6aaeb74b20155da61acd373c362ebccb81a83847b8aa813c8ba" Feb 24 00:07:53 crc kubenswrapper[4824]: E0224 00:07:53.407749 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-4xjg6_openshift-ovn-kubernetes(d985b875-dd5e-4767-a4e2-209894575a8f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" podUID="d985b875-dd5e-4767-a4e2-209894575a8f" Feb 24 00:07:53 crc kubenswrapper[4824]: I0224 00:07:53.427020 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0525cd89-44e0-47f1-856c-f566eb21596a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391a3f09eb431d4cc4af8eba60855591dd9fe90774ac0ecbf8ed112d41146fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5smpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f1d96e1625c71100e773b21982fde152e58d
48d96841da879d6ab878342b1e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5smpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dbxhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:53Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:53 crc kubenswrapper[4824]: I0224 00:07:53.442040 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56aba328-bba3-447a-9fc1-9ed04311acf5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93bd599f0dec3d074b172aab55b3b74283b37d4dddc2e6a97b7319e8d7bd15b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://357f08b7a73fd26e3a3c8232ff5641ec70da20551310ce9348d3b1462a30735c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52d213e230e92cadfb65d0efd8b4dea058e9f947897ad48c9c9d123ed37263ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1580132fb7d7b9203017c471a27257d03282c1b3dd97167f647ecd3848ba69a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://1580132fb7d7b9203017c471a27257d03282c1b3dd97167f647ecd3848ba69a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:53Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:53 crc kubenswrapper[4824]: I0224 00:07:53.489565 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b147bde27310be40a9d49e2254ff858c03a32602bad1de273272b7b3292f057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-24T00:07:53Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:53 crc kubenswrapper[4824]: I0224 00:07:53.507868 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nwxht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea7ca84e-cb8c-4bdf-ba10-bd2aca09bdaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69860166bff170380f7baa9e9b405900881403e17fbb017db27e9136e1c3696e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lxc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nwxht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:53Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:53 crc kubenswrapper[4824]: I0224 00:07:53.529780 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939ca085-9383-42e6-b7d6-37f101137273\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc64aac8c2354ba053a22c4c5cde7
371a99c7aaaece1fe2dd2f7b41eceb4e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13c56d6a66d6912c9ae019eb515bfeb043d9ce26a200da5330581d07cd849ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-02-24T00:07:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vcbgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:53Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:53 crc kubenswrapper[4824]: I0224 00:07:53.545469 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d64vq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b16a40794ba6c7bfbe672f5438284c9479efc8d1fe6a87b8292b3595799772e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:0
7:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd5818c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cd5818c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d152897b22f88ea3328011ed0b921e0f33f5c0375f4af98b424da48c20a268\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43d152897b22f88ea3328011ed0b921e0f33f5c0375f4af98b424da48c20a268\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86a454258a9043824fa0ce318ad33be2fc4c029f75f0fc489b22716d69d67d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86a454258a9043824fa0ce318ad33be2fc4c029f75f0fc489b22716d69d67d45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:31Z\\\"}},\\\"volume
Mounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d64vq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:53Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:53 crc kubenswrapper[4824]: I0224 00:07:53.555203 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2zsq6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bffd69c2-56a8-4fa0-9fbf-82a508f80ec1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4215a2974f4f4d1ab7b093c8d457f1f450276c89565a95ea13f1d42db9f07afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkj25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2zsq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:53Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:53 crc kubenswrapper[4824]: I0224 00:07:53.570175 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc76b60d9c2e639d3340a172e70670a434008774f3ec5751dae2ffc78a913486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-02-24T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:53Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:53 crc kubenswrapper[4824]: I0224 00:07:53.583511 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:53Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:53 crc kubenswrapper[4824]: I0224 00:07:53.597556 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:53Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:53 crc kubenswrapper[4824]: I0224 00:07:53.608364 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-98z42" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a648113f-3e46-4170-ba30-7155fefbb413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svh6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svh6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-98z42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:53Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:53 crc 
kubenswrapper[4824]: I0224 00:07:53.622843 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5209eb40-b077-42f2-9239-27bf1cde0e05\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c8ceeca4d71401bcd6ae7d050ac58f840a0cc91f699f91f799930c896be922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd9b54ae9c9e8
95db2191be45a25d352d8c0617350519ad21f6ed3f2af941fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf3a76ba41dba79001ebad9083600800b0f094accb5dc60c2a364aa001909feb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db87a346b00854fb6769899a263eb5f7fad8cfa115365e2197c0e02f7285ff99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T00:06:36Z\\\",\\\"message\\\":\\\"W0224 00:06:35.979734 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0224 00:06:35.980117 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771891595 cert, and key in /tmp/serving-cert-1325703996/serving-signer.crt, /tmp/serving-cert-1325703996/serving-signer.key\\\\nI0224 00:06:36.313594 1 observer_polling.go:159] Starting file observer\\\\nW0224 00:06:36.323413 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0224 00:06:36.325040 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 00:06:36.326853 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1325703996/tls.crt::/tmp/serving-cert-1325703996/tls.key\\\\\\\"\\\\nF0224 00:06:36.787238 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:06:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81bb21556a01a403315d2c9623aa07ab39ac4a789433423a89eda1da3547b6d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:53Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:53 crc kubenswrapper[4824]: I0224 00:07:53.636058 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684e1bbbe61030fdd75db917dbf9ac4fedf392502a653c8f4600dc829410aa95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://983e13c7b9d6d0b8183250aedfe93ec3ac42de4dae305c03caa9f653082bb329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:53Z is after 2025-08-24T17:21:41Z" 
Feb 24 00:07:53 crc kubenswrapper[4824]: I0224 00:07:53.650298 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:53Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:53 crc kubenswrapper[4824]: I0224 00:07:53.664507 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvqfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cafa37efd3e722940c0c2daae6107dc324136c75f889084d39711c20d7e887d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvqfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:53Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:53 crc kubenswrapper[4824]: I0224 00:07:53.677221 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 
00:07:53 crc kubenswrapper[4824]: I0224 00:07:53.684351 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d985b875-dd5e-4767-a4e2-209894575a8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76341c8e4fd3e6aaeb74b20155da61acd373c362ebccb81a83847b8aa813c8ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d676f945e46c0757bafb1cce26c0a049ef7be697902fb79cb2fc18454a13952\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T00:07:34Z\\\",\\\"message\\\":\\\"00:07:34.252477 6701 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0224 00:07:34.258107 6701 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0224 00:07:34.258158 6701 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0224 00:07:34.258177 6701 handler.go:190] Sending *v1.Node event handler 
2 for removal\\\\nI0224 00:07:34.258183 6701 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0224 00:07:34.258224 6701 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0224 00:07:34.258229 6701 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0224 00:07:34.258261 6701 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0224 00:07:34.258290 6701 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0224 00:07:34.258298 6701 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0224 00:07:34.258306 6701 handler.go:208] Removed *v1.Node event handler 2\\\\nI0224 00:07:34.258312 6701 handler.go:208] Removed *v1.Node event handler 7\\\\nI0224 00:07:34.258320 6701 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0224 00:07:34.260056 6701 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0224 00:07:34.260124 6701 factory.go:656] Stopping watch factory\\\\nI0224 00:07:34.260149 6701 ovnkube.go:599] Stopped ovnkube\\\\nI0224 00:07:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76341c8e4fd3e6aaeb74b20155da61acd373c362ebccb81a83847b8aa813c8ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T00:07:53Z\\\",\\\"message\\\":\\\"61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0224 00:07:53.258981 6959 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-98z42] creating logical port openshift-multus_network-metrics-daemon-98z42 for pod on switch crc\\\\nI0224 00:07:53.258997 6959 ovn.go:134] Ensuring zone local for Pod 
openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh in node crc\\\\nI0224 00:07:53.259010 6959 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh after 0 failed attempt(s)\\\\nI0224 00:07:53.259021 6959 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh\\\\nI0224 00:07:53.258757 6959 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-vcbgn in node crc\\\\nI0224 00:07:53.259038 6959 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-vcbgn after 0 failed attempt(s)\\\\nI0224 00:07:53.259043 6959 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/machine-config-daemon-vcbgn\\\\nI0224 00:07:53.259043 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/ne
t.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuberne
tes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xjg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:53Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:53 crc kubenswrapper[4824]: I0224 00:07:53.693457 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:07:53 crc kubenswrapper[4824]: E0224 00:07:53.693719 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 00:07:53 crc kubenswrapper[4824]: I0224 00:07:53.731975 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 08:56:06.311382255 +0000 UTC Feb 24 00:07:54 crc kubenswrapper[4824]: I0224 00:07:54.414290 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xjg6_d985b875-dd5e-4767-a4e2-209894575a8f/ovnkube-controller/2.log" Feb 24 00:07:54 crc kubenswrapper[4824]: I0224 00:07:54.420746 4824 scope.go:117] "RemoveContainer" containerID="76341c8e4fd3e6aaeb74b20155da61acd373c362ebccb81a83847b8aa813c8ba" Feb 24 00:07:54 crc kubenswrapper[4824]: E0224 00:07:54.421080 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-4xjg6_openshift-ovn-kubernetes(d985b875-dd5e-4767-a4e2-209894575a8f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" podUID="d985b875-dd5e-4767-a4e2-209894575a8f" Feb 24 00:07:54 crc kubenswrapper[4824]: I0224 00:07:54.443816 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc76b60d9c2e639d3340a172e70670a434008774f3ec5751dae2ffc78a913486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:54Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:54 crc kubenswrapper[4824]: I0224 00:07:54.462259 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-98z42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a648113f-3e46-4170-ba30-7155fefbb413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svh6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svh6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-98z42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:54Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:54 crc 
kubenswrapper[4824]: I0224 00:07:54.484936 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:54Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:54 crc kubenswrapper[4824]: I0224 00:07:54.502793 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:54Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:54 crc kubenswrapper[4824]: I0224 00:07:54.523984 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:54Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:54 crc kubenswrapper[4824]: I0224 00:07:54.543986 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvqfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cafa37efd3e722940c0c2daae6107dc324136c75f889084d39711c20d7e887d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvqfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:54Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:54 crc kubenswrapper[4824]: I0224 00:07:54.574633 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d985b875-dd5e-4767-a4e2-209894575a8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76341c8e4fd3e6aaeb74b20155da61acd373c362ebccb81a83847b8aa813c8ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76341c8e4fd3e6aaeb74b20155da61acd373c362ebccb81a83847b8aa813c8ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T00:07:53Z\\\",\\\"message\\\":\\\"61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0224 00:07:53.258981 6959 base_network_controller_pods.go:477] 
[default/openshift-multus/network-metrics-daemon-98z42] creating logical port openshift-multus_network-metrics-daemon-98z42 for pod on switch crc\\\\nI0224 00:07:53.258997 6959 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh in node crc\\\\nI0224 00:07:53.259010 6959 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh after 0 failed attempt(s)\\\\nI0224 00:07:53.259021 6959 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh\\\\nI0224 00:07:53.258757 6959 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-vcbgn in node crc\\\\nI0224 00:07:53.259038 6959 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-vcbgn after 0 failed attempt(s)\\\\nI0224 00:07:53.259043 6959 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/machine-config-daemon-vcbgn\\\\nI0224 00:07:53.259043 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4xjg6_openshift-ovn-kubernetes(d985b875-dd5e-4767-a4e2-209894575a8f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be
3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xjg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:54Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:54 crc kubenswrapper[4824]: I0224 00:07:54.596542 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5209eb40-b077-42f2-9239-27bf1cde0e05\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c8ceeca4d71401bcd6ae7d050ac58f840a0cc91f699f91f799930c896be922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd9b54ae9c9e895db2191be45a25d352d8c0617350519ad21f6ed3f2af941fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf3a76ba41dba79001ebad9083600800b0f094accb5dc60c2a364aa001909feb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db87a346b00854fb6769899a263eb5f7fad8cfa115365e2197c0e02f7285ff99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T00:06:36Z\\\"
,\\\"message\\\":\\\"W0224 00:06:35.979734 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0224 00:06:35.980117 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771891595 cert, and key in /tmp/serving-cert-1325703996/serving-signer.crt, /tmp/serving-cert-1325703996/serving-signer.key\\\\nI0224 00:06:36.313594 1 observer_polling.go:159] Starting file observer\\\\nW0224 00:06:36.323413 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0224 00:06:36.325040 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 00:06:36.326853 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1325703996/tls.crt::/tmp/serving-cert-1325703996/tls.key\\\\\\\"\\\\nF0224 00:06:36.787238 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:06:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81bb21556a01a403315d2c9623aa07ab39ac4a789433423a89eda1da3547b6d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:54Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:54 crc kubenswrapper[4824]: I0224 00:07:54.613617 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684e1bbbe61030fdd75db917dbf9ac4fedf392502a653c8f4600dc829410aa95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://983e13c7b9d6d0b8183250aedfe93ec3ac42de4dae305c03caa9f653082bb329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:54Z is after 2025-08-24T17:21:41Z" 
Feb 24 00:07:54 crc kubenswrapper[4824]: I0224 00:07:54.634934 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939ca085-9383-42e6-b7d6-37f101137273\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc64aac8c2354ba053a22c4c5cde7371a99c7aaaece1fe2dd2f7b41eceb4e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13c56d6a66d6912c9ae019eb515bfeb043d9ce26a200da5330581d07cd849ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vcbgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:54Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:54 crc kubenswrapper[4824]: I0224 00:07:54.653647 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d64vq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b16a40794ba6c7bfbe672f5438284c9479efc8d1fe6a87b8292b3595799772e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd58
18c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cd5818c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d152897b22f88ea3328011ed0b921e0f33f5c0375f4af98b424da48c20a268\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43d152897b22f88ea3328011ed0b921e0f33f5c0375f4af98b424da48c20a268\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:30Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86a454258a9043824fa0ce318ad33be2fc4c029f75f0fc489b22716d69d67d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86a454258a9043824fa0ce318ad33be2fc4c029f75f0fc489b22716d69d67d45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d64vq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:54Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:54 crc kubenswrapper[4824]: I0224 00:07:54.666116 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2zsq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bffd69c2-56a8-4fa0-9fbf-82a508f80ec1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4215a2974f4f4d1ab7b093c8d457f1f450276c89565a95ea13f1d42db9f07afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-02-24T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkj25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2zsq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:54Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:54 crc kubenswrapper[4824]: I0224 00:07:54.679231 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0525cd89-44e0-47f1-856c-f566eb21596a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391a3f09eb431d4cc4af8eba60855591dd9fe90774ac0ecbf8ed112d41146fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5smpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f1d96e1625c71100e773b21982fde152e58d
48d96841da879d6ab878342b1e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5smpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dbxhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:54Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:54 crc kubenswrapper[4824]: I0224 00:07:54.693865 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z42" Feb 24 00:07:54 crc kubenswrapper[4824]: E0224 00:07:54.694473 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z42" podUID="a648113f-3e46-4170-ba30-7155fefbb413" Feb 24 00:07:54 crc kubenswrapper[4824]: I0224 00:07:54.694013 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:07:54 crc kubenswrapper[4824]: E0224 00:07:54.694581 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 00:07:54 crc kubenswrapper[4824]: I0224 00:07:54.693877 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:07:54 crc kubenswrapper[4824]: E0224 00:07:54.694651 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 00:07:54 crc kubenswrapper[4824]: I0224 00:07:54.700741 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56aba328-bba3-447a-9fc1-9ed04311acf5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93bd599f0dec3d074b172aab55b3b74283b37d4dddc2e6a97b7319e8d7bd15b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\"
:\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://357f08b7a73fd26e3a3c8232ff5641ec70da20551310ce9348d3b1462a30735c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52d213e230e92cadfb65d0efd8b4dea058e9f947897ad48c9c9d123ed37263ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1580132fb7d7b9203017c471a27257d03282c1b3dd97167f647ecd3848ba69a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2
597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1580132fb7d7b9203017c471a27257d03282c1b3dd97167f647ecd3848ba69a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:54Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:54 crc kubenswrapper[4824]: I0224 00:07:54.720340 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b147bde27310be40a9d49e2254ff858c03a32602bad1de273272b7b3292f057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-24T00:07:54Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:54 crc kubenswrapper[4824]: I0224 00:07:54.733631 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 19:53:44.96743377 +0000 UTC Feb 24 00:07:54 crc kubenswrapper[4824]: I0224 00:07:54.734710 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nwxht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea7ca84e-cb8c-4bdf-ba10-bd2aca09bdaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69860166bff170380f7baa9e9b405900881403e17fbb017db27e9136e1c3696e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nwxht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:54Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:55 crc kubenswrapper[4824]: I0224 00:07:55.425073 4824 scope.go:117] "RemoveContainer" containerID="76341c8e4fd3e6aaeb74b20155da61acd373c362ebccb81a83847b8aa813c8ba" Feb 24 00:07:55 crc kubenswrapper[4824]: E0224 00:07:55.425776 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-4xjg6_openshift-ovn-kubernetes(d985b875-dd5e-4767-a4e2-209894575a8f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" podUID="d985b875-dd5e-4767-a4e2-209894575a8f" Feb 24 00:07:55 crc kubenswrapper[4824]: I0224 00:07:55.692818 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:07:55 crc kubenswrapper[4824]: E0224 00:07:55.692980 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 00:07:55 crc kubenswrapper[4824]: I0224 00:07:55.734214 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 16:20:53.828930348 +0000 UTC Feb 24 00:07:56 crc kubenswrapper[4824]: I0224 00:07:56.693998 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:07:56 crc kubenswrapper[4824]: I0224 00:07:56.694134 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:07:56 crc kubenswrapper[4824]: I0224 00:07:56.695014 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z42" Feb 24 00:07:56 crc kubenswrapper[4824]: E0224 00:07:56.695309 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 00:07:56 crc kubenswrapper[4824]: E0224 00:07:56.695452 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z42" podUID="a648113f-3e46-4170-ba30-7155fefbb413" Feb 24 00:07:56 crc kubenswrapper[4824]: E0224 00:07:56.695675 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 00:07:56 crc kubenswrapper[4824]: I0224 00:07:56.713166 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 24 00:07:56 crc kubenswrapper[4824]: I0224 00:07:56.725789 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d64vq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b16a40794ba6c7bfbe672f5438284c9479efc8d1fe6a87b8292b3595799772e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd58
18c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cd5818c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d152897b22f88ea3328011ed0b921e0f33f5c0375f4af98b424da48c20a268\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43d152897b22f88ea3328011ed0b921e0f33f5c0375f4af98b424da48c20a268\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:30Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86a454258a9043824fa0ce318ad33be2fc4c029f75f0fc489b22716d69d67d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86a454258a9043824fa0ce318ad33be2fc4c029f75f0fc489b22716d69d67d45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d64vq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:56Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:56 crc kubenswrapper[4824]: I0224 00:07:56.735343 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 10:10:44.618526587 +0000 UTC Feb 24 00:07:56 crc kubenswrapper[4824]: I0224 00:07:56.743249 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2zsq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bffd69c2-56a8-4fa0-9fbf-82a508f80ec1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4215a2974f4f4d1ab7b093c8d457f1f450276c89565a95ea13f1d42db9f07afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkj25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2zsq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:56Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:56 crc kubenswrapper[4824]: I0224 00:07:56.760961 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0525cd89-44e0-47f1-856c-f566eb21596a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391a3f09eb431d4cc4af8eba60855591dd9fe90774ac0ecbf8ed112d41146fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5smpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f1d96e1625c71100e773b21982fde152e58d
48d96841da879d6ab878342b1e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5smpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dbxhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:56Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:56 crc kubenswrapper[4824]: E0224 00:07:56.778656 4824 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 24 00:07:56 crc kubenswrapper[4824]: I0224 00:07:56.779869 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56aba328-bba3-447a-9fc1-9ed04311acf5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93bd599f0dec3d074b172aab55b3b74283b37d4dddc2e6a97b7319e8d7bd15b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://357f08b7a73fd26e3a3c8232ff5641ec70da20551310ce9348d3b1462a30735c\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52d213e230e92cadfb65d0efd8b4dea058e9f947897ad48c9c9d123ed37263ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1580132fb7d7b9203017c471a27257d03282c1b3dd97167f647ecd3848ba69a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720
243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1580132fb7d7b9203017c471a27257d03282c1b3dd97167f647ecd3848ba69a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:56Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:56 crc kubenswrapper[4824]: I0224 00:07:56.793314 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b147bde27310be40a9d49e2254ff858c03a32602bad1de273272b7b3292f057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-24T00:07:56Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:56 crc kubenswrapper[4824]: I0224 00:07:56.805457 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nwxht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea7ca84e-cb8c-4bdf-ba10-bd2aca09bdaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69860166bff170380f7baa9e9b405900881403e17fbb017db27e9136e1c3696e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lxc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nwxht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:56Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:56 crc kubenswrapper[4824]: I0224 00:07:56.817651 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939ca085-9383-42e6-b7d6-37f101137273\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc64aac8c2354ba053a22c4c5cde7
371a99c7aaaece1fe2dd2f7b41eceb4e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13c56d6a66d6912c9ae019eb515bfeb043d9ce26a200da5330581d07cd849ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-02-24T00:07:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vcbgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:56Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:56 crc kubenswrapper[4824]: I0224 00:07:56.830384 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc76b60d9c2e639d3340a172e70670a434008774f3ec5751dae2ffc78a913486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-e
tc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:56Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:56 crc kubenswrapper[4824]: I0224 00:07:56.846081 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:56Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:56 crc kubenswrapper[4824]: I0224 00:07:56.857736 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:56Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:56 crc kubenswrapper[4824]: I0224 00:07:56.869409 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-98z42" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a648113f-3e46-4170-ba30-7155fefbb413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svh6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svh6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-98z42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:56Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:56 crc 
kubenswrapper[4824]: I0224 00:07:56.881288 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvqfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cafa37efd3e722940c0c2daae6107dc324136c75f889084d39711c20d7e887d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvqfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:56Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:56 crc 
kubenswrapper[4824]: I0224 00:07:56.900994 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d985b875-dd5e-4767-a4e2-209894575a8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76341c8e4fd3e6aaeb74b20155da61acd373c362ebccb81a83847b8aa813c8ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76341c8e4fd3e6aaeb74b20155da61acd373c362ebccb81a83847b8aa813c8ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T00:07:53Z\\\",\\\"message\\\":\\\"61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0224 00:07:53.258981 6959 base_network_controller_pods.go:477] 
[default/openshift-multus/network-metrics-daemon-98z42] creating logical port openshift-multus_network-metrics-daemon-98z42 for pod on switch crc\\\\nI0224 00:07:53.258997 6959 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh in node crc\\\\nI0224 00:07:53.259010 6959 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh after 0 failed attempt(s)\\\\nI0224 00:07:53.259021 6959 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh\\\\nI0224 00:07:53.258757 6959 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-vcbgn in node crc\\\\nI0224 00:07:53.259038 6959 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-vcbgn after 0 failed attempt(s)\\\\nI0224 00:07:53.259043 6959 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/machine-config-daemon-vcbgn\\\\nI0224 00:07:53.259043 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4xjg6_openshift-ovn-kubernetes(d985b875-dd5e-4767-a4e2-209894575a8f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be
3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xjg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:56Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:56 crc kubenswrapper[4824]: I0224 00:07:56.919714 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5209eb40-b077-42f2-9239-27bf1cde0e05\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c8ceeca4d71401bcd6ae7d050ac58f840a0cc91f699f91f799930c896be922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd9b54ae9c9e895db2191be45a25d352d8c0617350519ad21f6ed3f2af941fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf3a76ba41dba79001ebad9083600800b0f094accb5dc60c2a364aa001909feb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db87a346b00854fb6769899a263eb5f7fad8cfa115365e2197c0e02f7285ff99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T00:06:36Z\\\"
,\\\"message\\\":\\\"W0224 00:06:35.979734 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0224 00:06:35.980117 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771891595 cert, and key in /tmp/serving-cert-1325703996/serving-signer.crt, /tmp/serving-cert-1325703996/serving-signer.key\\\\nI0224 00:06:36.313594 1 observer_polling.go:159] Starting file observer\\\\nW0224 00:06:36.323413 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0224 00:06:36.325040 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 00:06:36.326853 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1325703996/tls.crt::/tmp/serving-cert-1325703996/tls.key\\\\\\\"\\\\nF0224 00:06:36.787238 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:06:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81bb21556a01a403315d2c9623aa07ab39ac4a789433423a89eda1da3547b6d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:56Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:56 crc kubenswrapper[4824]: I0224 00:07:56.933593 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684e1bbbe61030fdd75db917dbf9ac4fedf392502a653c8f4600dc829410aa95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://983e13c7b9d6d0b8183250aedfe93ec3ac42de4dae305c03caa9f653082bb329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:56Z is after 2025-08-24T17:21:41Z" 
Feb 24 00:07:56 crc kubenswrapper[4824]: I0224 00:07:56.945977 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:07:56Z is after 2025-08-24T17:21:41Z" Feb 24 00:07:57 crc kubenswrapper[4824]: I0224 00:07:57.693705 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:07:57 crc kubenswrapper[4824]: E0224 00:07:57.693910 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 00:07:57 crc kubenswrapper[4824]: I0224 00:07:57.736057 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 07:41:53.129771298 +0000 UTC Feb 24 00:07:58 crc kubenswrapper[4824]: I0224 00:07:58.693294 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z42" Feb 24 00:07:58 crc kubenswrapper[4824]: E0224 00:07:58.693438 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z42" podUID="a648113f-3e46-4170-ba30-7155fefbb413" Feb 24 00:07:58 crc kubenswrapper[4824]: I0224 00:07:58.693298 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:07:58 crc kubenswrapper[4824]: E0224 00:07:58.693674 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 00:07:58 crc kubenswrapper[4824]: I0224 00:07:58.693875 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:07:58 crc kubenswrapper[4824]: E0224 00:07:58.693942 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 00:07:58 crc kubenswrapper[4824]: I0224 00:07:58.736557 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 06:12:15.611325744 +0000 UTC Feb 24 00:07:59 crc kubenswrapper[4824]: I0224 00:07:59.692715 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:07:59 crc kubenswrapper[4824]: E0224 00:07:59.692898 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 00:07:59 crc kubenswrapper[4824]: I0224 00:07:59.737656 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 17:47:37.308748108 +0000 UTC Feb 24 00:08:00 crc kubenswrapper[4824]: I0224 00:08:00.693092 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:08:00 crc kubenswrapper[4824]: I0224 00:08:00.693203 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:08:00 crc kubenswrapper[4824]: E0224 00:08:00.693250 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 00:08:00 crc kubenswrapper[4824]: E0224 00:08:00.693367 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 00:08:00 crc kubenswrapper[4824]: I0224 00:08:00.693092 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z42" Feb 24 00:08:00 crc kubenswrapper[4824]: E0224 00:08:00.693696 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-98z42" podUID="a648113f-3e46-4170-ba30-7155fefbb413" Feb 24 00:08:00 crc kubenswrapper[4824]: I0224 00:08:00.738657 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 15:33:30.245790217 +0000 UTC Feb 24 00:08:01 crc kubenswrapper[4824]: I0224 00:08:01.693281 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:08:01 crc kubenswrapper[4824]: E0224 00:08:01.693624 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 00:08:01 crc kubenswrapper[4824]: I0224 00:08:01.704112 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 24 00:08:01 crc kubenswrapper[4824]: I0224 00:08:01.739400 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 11:23:56.344739211 +0000 UTC Feb 24 00:08:01 crc kubenswrapper[4824]: E0224 00:08:01.780595 4824 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 24 00:08:02 crc kubenswrapper[4824]: I0224 00:08:02.196448 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:08:02 crc kubenswrapper[4824]: I0224 00:08:02.196494 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:08:02 crc kubenswrapper[4824]: I0224 00:08:02.196504 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:08:02 crc kubenswrapper[4824]: I0224 00:08:02.196540 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:08:02 crc kubenswrapper[4824]: I0224 00:08:02.196561 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:08:02Z","lastTransitionTime":"2026-02-24T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:08:02 crc kubenswrapper[4824]: E0224 00:08:02.213535 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7ea41d01-04ab-44da-af10-993e94777268\\\",\\\"systemUUID\\\":\\\"d5e3d68d-d538-4dbe-b3fe-7347ab36b29a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:02Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:02 crc kubenswrapper[4824]: I0224 00:08:02.217857 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:08:02 crc kubenswrapper[4824]: I0224 00:08:02.217915 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:08:02 crc kubenswrapper[4824]: I0224 00:08:02.217933 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:08:02 crc kubenswrapper[4824]: I0224 00:08:02.217955 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:08:02 crc kubenswrapper[4824]: I0224 00:08:02.217973 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:08:02Z","lastTransitionTime":"2026-02-24T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:08:02 crc kubenswrapper[4824]: E0224 00:08:02.237097 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7ea41d01-04ab-44da-af10-993e94777268\\\",\\\"systemUUID\\\":\\\"d5e3d68d-d538-4dbe-b3fe-7347ab36b29a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:02Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:02 crc kubenswrapper[4824]: I0224 00:08:02.241826 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:08:02 crc kubenswrapper[4824]: I0224 00:08:02.241878 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:08:02 crc kubenswrapper[4824]: I0224 00:08:02.241890 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:08:02 crc kubenswrapper[4824]: I0224 00:08:02.241908 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:08:02 crc kubenswrapper[4824]: I0224 00:08:02.241920 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:08:02Z","lastTransitionTime":"2026-02-24T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:08:02 crc kubenswrapper[4824]: E0224 00:08:02.257413 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7ea41d01-04ab-44da-af10-993e94777268\\\",\\\"systemUUID\\\":\\\"d5e3d68d-d538-4dbe-b3fe-7347ab36b29a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:02Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:02 crc kubenswrapper[4824]: I0224 00:08:02.262562 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:08:02 crc kubenswrapper[4824]: I0224 00:08:02.262612 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:08:02 crc kubenswrapper[4824]: I0224 00:08:02.262622 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:08:02 crc kubenswrapper[4824]: I0224 00:08:02.262640 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:08:02 crc kubenswrapper[4824]: I0224 00:08:02.262652 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:08:02Z","lastTransitionTime":"2026-02-24T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:08:02 crc kubenswrapper[4824]: E0224 00:08:02.277110 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7ea41d01-04ab-44da-af10-993e94777268\\\",\\\"systemUUID\\\":\\\"d5e3d68d-d538-4dbe-b3fe-7347ab36b29a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:02Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:02 crc kubenswrapper[4824]: I0224 00:08:02.282018 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:08:02 crc kubenswrapper[4824]: I0224 00:08:02.282063 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:08:02 crc kubenswrapper[4824]: I0224 00:08:02.282073 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:08:02 crc kubenswrapper[4824]: I0224 00:08:02.282090 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:08:02 crc kubenswrapper[4824]: I0224 00:08:02.282101 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:08:02Z","lastTransitionTime":"2026-02-24T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:08:02 crc kubenswrapper[4824]: E0224 00:08:02.295356 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7ea41d01-04ab-44da-af10-993e94777268\\\",\\\"systemUUID\\\":\\\"d5e3d68d-d538-4dbe-b3fe-7347ab36b29a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:02Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:02 crc kubenswrapper[4824]: E0224 00:08:02.295465 4824 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 24 00:08:02 crc kubenswrapper[4824]: I0224 00:08:02.693831 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z42" Feb 24 00:08:02 crc kubenswrapper[4824]: E0224 00:08:02.694025 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z42" podUID="a648113f-3e46-4170-ba30-7155fefbb413" Feb 24 00:08:02 crc kubenswrapper[4824]: I0224 00:08:02.694658 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:08:02 crc kubenswrapper[4824]: E0224 00:08:02.694796 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 00:08:02 crc kubenswrapper[4824]: I0224 00:08:02.694870 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:08:02 crc kubenswrapper[4824]: E0224 00:08:02.695031 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 00:08:02 crc kubenswrapper[4824]: I0224 00:08:02.756148 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 16:25:05.65268619 +0000 UTC Feb 24 00:08:03 crc kubenswrapper[4824]: I0224 00:08:03.693390 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:08:03 crc kubenswrapper[4824]: E0224 00:08:03.693913 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 00:08:03 crc kubenswrapper[4824]: I0224 00:08:03.756946 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 08:56:00.693843529 +0000 UTC Feb 24 00:08:04 crc kubenswrapper[4824]: I0224 00:08:04.685205 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:08:04 crc kubenswrapper[4824]: E0224 00:08:04.685384 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:08.685354711 +0000 UTC m=+212.674979190 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:08:04 crc kubenswrapper[4824]: I0224 00:08:04.685975 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:08:04 crc kubenswrapper[4824]: I0224 00:08:04.686140 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:08:04 crc kubenswrapper[4824]: I0224 00:08:04.686446 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:08:04 crc kubenswrapper[4824]: I0224 00:08:04.686680 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:08:04 crc kubenswrapper[4824]: E0224 00:08:04.686282 4824 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 24 00:08:04 crc kubenswrapper[4824]: E0224 00:08:04.686939 4824 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 24 00:08:04 crc kubenswrapper[4824]: E0224 00:08:04.687120 4824 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 00:08:04 crc kubenswrapper[4824]: E0224 00:08:04.686383 4824 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 24 00:08:04 crc kubenswrapper[4824]: E0224 00:08:04.687321 4824 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 24 00:08:04 crc kubenswrapper[4824]: E0224 00:08:04.687356 4824 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 00:08:04 crc 
kubenswrapper[4824]: E0224 00:08:04.686632 4824 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 00:08:04 crc kubenswrapper[4824]: E0224 00:08:04.686824 4824 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 00:08:04 crc kubenswrapper[4824]: E0224 00:08:04.687247 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-24 00:09:08.687234543 +0000 UTC m=+212.676859012 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 00:08:04 crc kubenswrapper[4824]: E0224 00:08:04.687568 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 00:09:08.68749083 +0000 UTC m=+212.677115339 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 00:08:04 crc kubenswrapper[4824]: E0224 00:08:04.687609 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-24 00:09:08.687591653 +0000 UTC m=+212.677216242 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 00:08:04 crc kubenswrapper[4824]: E0224 00:08:04.687644 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 00:09:08.687626034 +0000 UTC m=+212.677250653 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 00:08:04 crc kubenswrapper[4824]: I0224 00:08:04.693147 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z42" Feb 24 00:08:04 crc kubenswrapper[4824]: I0224 00:08:04.693203 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:08:04 crc kubenswrapper[4824]: I0224 00:08:04.693161 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:08:04 crc kubenswrapper[4824]: E0224 00:08:04.693411 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z42" podUID="a648113f-3e46-4170-ba30-7155fefbb413" Feb 24 00:08:04 crc kubenswrapper[4824]: E0224 00:08:04.693493 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 00:08:04 crc kubenswrapper[4824]: E0224 00:08:04.693595 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 00:08:04 crc kubenswrapper[4824]: I0224 00:08:04.757493 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 17:25:55.777963944 +0000 UTC Feb 24 00:08:05 crc kubenswrapper[4824]: I0224 00:08:05.693075 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:08:05 crc kubenswrapper[4824]: E0224 00:08:05.693230 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 00:08:05 crc kubenswrapper[4824]: I0224 00:08:05.758769 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 19:32:27.767050168 +0000 UTC Feb 24 00:08:06 crc kubenswrapper[4824]: I0224 00:08:06.693037 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:08:06 crc kubenswrapper[4824]: I0224 00:08:06.693144 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z42" Feb 24 00:08:06 crc kubenswrapper[4824]: E0224 00:08:06.693253 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 00:08:06 crc kubenswrapper[4824]: E0224 00:08:06.694175 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z42" podUID="a648113f-3e46-4170-ba30-7155fefbb413" Feb 24 00:08:06 crc kubenswrapper[4824]: I0224 00:08:06.694182 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:08:06 crc kubenswrapper[4824]: E0224 00:08:06.694343 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 00:08:06 crc kubenswrapper[4824]: I0224 00:08:06.696443 4824 scope.go:117] "RemoveContainer" containerID="76341c8e4fd3e6aaeb74b20155da61acd373c362ebccb81a83847b8aa813c8ba" Feb 24 00:08:06 crc kubenswrapper[4824]: E0224 00:08:06.697613 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-4xjg6_openshift-ovn-kubernetes(d985b875-dd5e-4767-a4e2-209894575a8f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" podUID="d985b875-dd5e-4767-a4e2-209894575a8f" Feb 24 00:08:06 crc kubenswrapper[4824]: I0224 00:08:06.721353 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc76b60d9c2e639d3340a172e70670a434008774f3ec5751dae2ffc78a913486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d
3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:06Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:06 crc kubenswrapper[4824]: I0224 00:08:06.745201 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc53949c-a6a4-48c1-a312-cc8d39a3238f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://706813f751a53e8e88d84668245bb4b94bc39d6b611f0fed9f774c226dc8c632\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fe1cb1e114c1440630a33e3dd127af3600c520e37148cc01ab2e4ed346941ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T00:06:33Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0224 00:06:03.976490 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0224 00:06:03.977317 1 observer_polling.go:159] Starting file observer\\\\nI0224 00:06:03.977962 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0224 00:06:03.978570 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0224 00:06:33.531382 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0224 00:06:33.531507 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:33Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:06:03Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45746da193a3aaa64e64067e2324afb4e4fb1a90a1357461ed97c0bbb5109f5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bf86745569172505df6632421bd1587317cb06e26e70937563bd7d0341c6086\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://213ad3b341ace3f4473aa48ecaaa41814fe670417431f6d5cd04be03482e597c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:06Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:06 crc kubenswrapper[4824]: I0224 00:08:06.759321 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 20:10:48.062486642 +0000 UTC Feb 24 00:08:06 crc 
kubenswrapper[4824]: I0224 00:08:06.762558 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:06Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:06 crc kubenswrapper[4824]: I0224 00:08:06.779714 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:06Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:06 crc kubenswrapper[4824]: E0224 00:08:06.781713 4824 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 24 00:08:06 crc kubenswrapper[4824]: I0224 00:08:06.796155 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-98z42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a648113f-3e46-4170-ba30-7155fefbb413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svh6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svh6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-98z42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:06Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:06 crc 
kubenswrapper[4824]: I0224 00:08:06.810478 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17795ce3-5917-4f02-a513-de53b7c702cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd198d5e4adeaa8e23f5262bc17476b82b1c76b3bfd06b385590eede8c0baa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://370e38e80ea6eb0a1add2e84d515cef4e8aeb77dc43beba4d0c3fdeb871f4fb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://370e38e80ea6eb0a1add2e84d515cef4e8aeb77dc43beba4d0c3fdeb871f4fb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:06Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:06 crc kubenswrapper[4824]: I0224 00:08:06.829772 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5209eb40-b077-42f2-9239-27bf1cde0e05\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c8ceeca4d71401bcd6ae7d050ac58f840a0cc91f699f91f799930c896be922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd9b54ae9c9e895db2191be45a25d352d8c0617350519ad21f6ed3f2af941fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf3a76ba41dba79001ebad9083600800b0f094accb5dc60c2a364aa001909feb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db87a346b00854fb6769899a263eb5f7fad8cfa115365e2197c0e02f7285ff99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T00:06:36Z\\\"
,\\\"message\\\":\\\"W0224 00:06:35.979734 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0224 00:06:35.980117 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771891595 cert, and key in /tmp/serving-cert-1325703996/serving-signer.crt, /tmp/serving-cert-1325703996/serving-signer.key\\\\nI0224 00:06:36.313594 1 observer_polling.go:159] Starting file observer\\\\nW0224 00:06:36.323413 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0224 00:06:36.325040 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 00:06:36.326853 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1325703996/tls.crt::/tmp/serving-cert-1325703996/tls.key\\\\\\\"\\\\nF0224 00:06:36.787238 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:06:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81bb21556a01a403315d2c9623aa07ab39ac4a789433423a89eda1da3547b6d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:06Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:06 crc kubenswrapper[4824]: I0224 00:08:06.844089 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684e1bbbe61030fdd75db917dbf9ac4fedf392502a653c8f4600dc829410aa95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://983e13c7b9d6d0b8183250aedfe93ec3ac42de4dae305c03caa9f653082bb329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:06Z is after 2025-08-24T17:21:41Z" 
Feb 24 00:08:06 crc kubenswrapper[4824]: I0224 00:08:06.859204 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:06Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:06 crc kubenswrapper[4824]: I0224 00:08:06.876695 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvqfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cafa37efd3e722940c0c2daae6107dc324136c75f889084d39711c20d7e887d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvqfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:06Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:06 crc kubenswrapper[4824]: I0224 00:08:06.900868 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d985b875-dd5e-4767-a4e2-209894575a8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76341c8e4fd3e6aaeb74b20155da61acd373c362ebccb81a83847b8aa813c8ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76341c8e4fd3e6aaeb74b20155da61acd373c362ebccb81a83847b8aa813c8ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T00:07:53Z\\\",\\\"message\\\":\\\"61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0224 00:07:53.258981 6959 base_network_controller_pods.go:477] 
[default/openshift-multus/network-metrics-daemon-98z42] creating logical port openshift-multus_network-metrics-daemon-98z42 for pod on switch crc\\\\nI0224 00:07:53.258997 6959 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh in node crc\\\\nI0224 00:07:53.259010 6959 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh after 0 failed attempt(s)\\\\nI0224 00:07:53.259021 6959 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh\\\\nI0224 00:07:53.258757 6959 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-vcbgn in node crc\\\\nI0224 00:07:53.259038 6959 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-vcbgn after 0 failed attempt(s)\\\\nI0224 00:07:53.259043 6959 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/machine-config-daemon-vcbgn\\\\nI0224 00:07:53.259043 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4xjg6_openshift-ovn-kubernetes(d985b875-dd5e-4767-a4e2-209894575a8f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be
3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xjg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:06Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:06 crc kubenswrapper[4824]: I0224 00:08:06.919404 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b147bde27310be40a9d49e2254ff858c03a32602bad1de273272b7b3292f057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-24T00:08:06Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:06 crc kubenswrapper[4824]: I0224 00:08:06.936299 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nwxht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea7ca84e-cb8c-4bdf-ba10-bd2aca09bdaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69860166bff170380f7baa9e9b405900881403e17fbb017db27e9136e1c3696e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lxc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nwxht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:06Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:06 crc kubenswrapper[4824]: I0224 00:08:06.951072 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939ca085-9383-42e6-b7d6-37f101137273\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc64aac8c2354ba053a22c4c5cde7
371a99c7aaaece1fe2dd2f7b41eceb4e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13c56d6a66d6912c9ae019eb515bfeb043d9ce26a200da5330581d07cd849ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-02-24T00:07:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vcbgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:06Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:06 crc kubenswrapper[4824]: I0224 00:08:06.970749 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d64vq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b16a40794ba6c7bfbe672f5438284c9479efc8d1fe6a87b8292b3595799772e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:0
7:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd5818c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cd5818c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d152897b22f88ea3328011ed0b921e0f33f5c0375f4af98b424da48c20a268\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43d152897b22f88ea3328011ed0b921e0f33f5c0375f4af98b424da48c20a268\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86a454258a9043824fa0ce318ad33be2fc4c029f75f0fc489b22716d69d67d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86a454258a9043824fa0ce318ad33be2fc4c029f75f0fc489b22716d69d67d45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:31Z\\\"}},\\\"volume
Mounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d64vq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:06Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:06 crc kubenswrapper[4824]: I0224 00:08:06.982384 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2zsq6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bffd69c2-56a8-4fa0-9fbf-82a508f80ec1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4215a2974f4f4d1ab7b093c8d457f1f450276c89565a95ea13f1d42db9f07afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkj25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2zsq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:06Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:06 crc kubenswrapper[4824]: I0224 00:08:06.995883 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0525cd89-44e0-47f1-856c-f566eb21596a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391a3f09eb431d4cc4af8eba60855591dd9fe90774ac0ecbf8ed112d41146fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5smpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f1d96e1625c71100e773b21982fde152e58d48d96841da879d6ab878342b1e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5smpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dbxhh\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:06Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:07 crc kubenswrapper[4824]: I0224 00:08:07.012277 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56aba328-bba3-447a-9fc1-9ed04311acf5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93bd599f0dec3d074b172aab55b3b74283b37d4dddc2e6a97b7319e8d7bd15b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://357f08b7a73fd26e3a3c8232ff5641ec70da20551310ce9348d3b1462a30735c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52d213e230e92cadfb65d0efd8b4dea058e9f947897ad48c9c9d123ed37263ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15
80132fb7d7b9203017c471a27257d03282c1b3dd97167f647ecd3848ba69a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1580132fb7d7b9203017c471a27257d03282c1b3dd97167f647ecd3848ba69a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:07Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:07 crc kubenswrapper[4824]: I0224 00:08:07.693393 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:08:07 crc kubenswrapper[4824]: E0224 00:08:07.693612 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 00:08:07 crc kubenswrapper[4824]: I0224 00:08:07.759921 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 07:43:33.380567384 +0000 UTC Feb 24 00:08:08 crc kubenswrapper[4824]: I0224 00:08:08.532461 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a648113f-3e46-4170-ba30-7155fefbb413-metrics-certs\") pod \"network-metrics-daemon-98z42\" (UID: \"a648113f-3e46-4170-ba30-7155fefbb413\") " pod="openshift-multus/network-metrics-daemon-98z42" Feb 24 00:08:08 crc kubenswrapper[4824]: E0224 00:08:08.532656 4824 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 24 00:08:08 crc kubenswrapper[4824]: E0224 00:08:08.532723 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a648113f-3e46-4170-ba30-7155fefbb413-metrics-certs podName:a648113f-3e46-4170-ba30-7155fefbb413 nodeName:}" failed. No retries permitted until 2026-02-24 00:08:40.532706575 +0000 UTC m=+184.522331034 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a648113f-3e46-4170-ba30-7155fefbb413-metrics-certs") pod "network-metrics-daemon-98z42" (UID: "a648113f-3e46-4170-ba30-7155fefbb413") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 24 00:08:08 crc kubenswrapper[4824]: I0224 00:08:08.693292 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:08:08 crc kubenswrapper[4824]: I0224 00:08:08.693367 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:08:08 crc kubenswrapper[4824]: I0224 00:08:08.693390 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z42" Feb 24 00:08:08 crc kubenswrapper[4824]: E0224 00:08:08.693460 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 00:08:08 crc kubenswrapper[4824]: E0224 00:08:08.693645 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z42" podUID="a648113f-3e46-4170-ba30-7155fefbb413" Feb 24 00:08:08 crc kubenswrapper[4824]: E0224 00:08:08.693791 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 00:08:08 crc kubenswrapper[4824]: I0224 00:08:08.760840 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 09:02:03.311170118 +0000 UTC Feb 24 00:08:09 crc kubenswrapper[4824]: I0224 00:08:09.693092 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:08:09 crc kubenswrapper[4824]: E0224 00:08:09.693293 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 00:08:09 crc kubenswrapper[4824]: I0224 00:08:09.761780 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 04:58:50.995611414 +0000 UTC Feb 24 00:08:10 crc kubenswrapper[4824]: I0224 00:08:10.693909 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:08:10 crc kubenswrapper[4824]: I0224 00:08:10.694003 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z42" Feb 24 00:08:10 crc kubenswrapper[4824]: I0224 00:08:10.693910 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:08:10 crc kubenswrapper[4824]: E0224 00:08:10.694923 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 00:08:10 crc kubenswrapper[4824]: E0224 00:08:10.695050 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z42" podUID="a648113f-3e46-4170-ba30-7155fefbb413" Feb 24 00:08:10 crc kubenswrapper[4824]: E0224 00:08:10.695102 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 00:08:10 crc kubenswrapper[4824]: I0224 00:08:10.711012 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 24 00:08:10 crc kubenswrapper[4824]: I0224 00:08:10.762742 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 23:13:43.047161668 +0000 UTC Feb 24 00:08:11 crc kubenswrapper[4824]: I0224 00:08:11.692763 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:08:11 crc kubenswrapper[4824]: E0224 00:08:11.693306 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 00:08:11 crc kubenswrapper[4824]: I0224 00:08:11.763247 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 23:51:59.134108015 +0000 UTC Feb 24 00:08:11 crc kubenswrapper[4824]: E0224 00:08:11.783374 4824 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 24 00:08:12 crc kubenswrapper[4824]: I0224 00:08:12.343132 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:08:12 crc kubenswrapper[4824]: I0224 00:08:12.343542 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:08:12 crc kubenswrapper[4824]: I0224 00:08:12.343633 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:08:12 crc kubenswrapper[4824]: I0224 00:08:12.343729 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:08:12 crc kubenswrapper[4824]: I0224 00:08:12.343812 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:08:12Z","lastTransitionTime":"2026-02-24T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:08:12 crc kubenswrapper[4824]: E0224 00:08:12.361082 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7ea41d01-04ab-44da-af10-993e94777268\\\",\\\"systemUUID\\\":\\\"d5e3d68d-d538-4dbe-b3fe-7347ab36b29a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:12Z is after 2025-08-24T17:21:41Z"
Feb 24 00:08:12 crc kubenswrapper[4824]: I0224 00:08:12.365511 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:08:12 crc kubenswrapper[4824]: I0224 00:08:12.365600 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:08:12 crc kubenswrapper[4824]: I0224 00:08:12.365609 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:08:12 crc kubenswrapper[4824]: I0224 00:08:12.365627 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 00:08:12 crc kubenswrapper[4824]: I0224 00:08:12.365657 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:08:12Z","lastTransitionTime":"2026-02-24T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:08:12 crc kubenswrapper[4824]: E0224 00:08:12.383943 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7ea41d01-04ab-44da-af10-993e94777268\\\",\\\"systemUUID\\\":\\\"d5e3d68d-d538-4dbe-b3fe-7347ab36b29a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:12Z is after 2025-08-24T17:21:41Z"
Feb 24 00:08:12 crc kubenswrapper[4824]: I0224 00:08:12.388672 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:08:12 crc kubenswrapper[4824]: I0224 00:08:12.388719 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:08:12 crc kubenswrapper[4824]: I0224 00:08:12.388731 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:08:12 crc kubenswrapper[4824]: I0224 00:08:12.388775 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 00:08:12 crc kubenswrapper[4824]: I0224 00:08:12.388790 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:08:12Z","lastTransitionTime":"2026-02-24T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:08:12 crc kubenswrapper[4824]: E0224 00:08:12.405194 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7ea41d01-04ab-44da-af10-993e94777268\\\",\\\"systemUUID\\\":\\\"d5e3d68d-d538-4dbe-b3fe-7347ab36b29a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:12Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:12 crc kubenswrapper[4824]: I0224 00:08:12.409936 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:08:12 crc kubenswrapper[4824]: I0224 00:08:12.409963 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:08:12 crc kubenswrapper[4824]: I0224 00:08:12.409972 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:08:12 crc kubenswrapper[4824]: I0224 00:08:12.409988 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:08:12 crc kubenswrapper[4824]: I0224 00:08:12.409997 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:08:12Z","lastTransitionTime":"2026-02-24T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:08:12 crc kubenswrapper[4824]: E0224 00:08:12.425259 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7ea41d01-04ab-44da-af10-993e94777268\\\",\\\"systemUUID\\\":\\\"d5e3d68d-d538-4dbe-b3fe-7347ab36b29a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:12Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:12 crc kubenswrapper[4824]: I0224 00:08:12.431077 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:08:12 crc kubenswrapper[4824]: I0224 00:08:12.431129 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:08:12 crc kubenswrapper[4824]: I0224 00:08:12.431141 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:08:12 crc kubenswrapper[4824]: I0224 00:08:12.431163 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:08:12 crc kubenswrapper[4824]: I0224 00:08:12.431175 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:08:12Z","lastTransitionTime":"2026-02-24T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:08:12 crc kubenswrapper[4824]: E0224 00:08:12.452638 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7ea41d01-04ab-44da-af10-993e94777268\\\",\\\"systemUUID\\\":\\\"d5e3d68d-d538-4dbe-b3fe-7347ab36b29a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:12Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:12 crc kubenswrapper[4824]: E0224 00:08:12.452839 4824 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 24 00:08:12 crc kubenswrapper[4824]: I0224 00:08:12.486074 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wvqfl_15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac/kube-multus/0.log" Feb 24 00:08:12 crc kubenswrapper[4824]: I0224 00:08:12.486165 4824 generic.go:334] "Generic (PLEG): container finished" podID="15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac" containerID="4cafa37efd3e722940c0c2daae6107dc324136c75f889084d39711c20d7e887d" exitCode=1 Feb 24 00:08:12 crc kubenswrapper[4824]: I0224 00:08:12.486221 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wvqfl" event={"ID":"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac","Type":"ContainerDied","Data":"4cafa37efd3e722940c0c2daae6107dc324136c75f889084d39711c20d7e887d"} Feb 24 00:08:12 crc kubenswrapper[4824]: I0224 00:08:12.487016 4824 scope.go:117] "RemoveContainer" containerID="4cafa37efd3e722940c0c2daae6107dc324136c75f889084d39711c20d7e887d" Feb 24 00:08:12 crc kubenswrapper[4824]: I0224 00:08:12.505680 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"939ca085-9383-42e6-b7d6-37f101137273\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc64aac8c2354ba053a22c4c5cde7371a99c7aaaece1fe2dd2f7b41eceb4e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13c56d6a66d6912c9ae019eb515bfeb043d9ce26
a200da5330581d07cd849ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vcbgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:12Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:12 crc kubenswrapper[4824]: I0224 00:08:12.527944 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d64vq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b16a40794ba6c7bfbe672f5438284c9479efc8d1fe6a87b8292b3595799772e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd58
18c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cd5818c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d152897b22f88ea3328011ed0b921e0f33f5c0375f4af98b424da48c20a268\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43d152897b22f88ea3328011ed0b921e0f33f5c0375f4af98b424da48c20a268\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:30Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86a454258a9043824fa0ce318ad33be2fc4c029f75f0fc489b22716d69d67d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86a454258a9043824fa0ce318ad33be2fc4c029f75f0fc489b22716d69d67d45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d64vq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:12Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:12 crc kubenswrapper[4824]: I0224 00:08:12.541661 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2zsq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bffd69c2-56a8-4fa0-9fbf-82a508f80ec1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4215a2974f4f4d1ab7b093c8d457f1f450276c89565a95ea13f1d42db9f07afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-02-24T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkj25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2zsq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:12Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:12 crc kubenswrapper[4824]: I0224 00:08:12.554741 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0525cd89-44e0-47f1-856c-f566eb21596a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391a3f09eb431d4cc4af8eba60855591dd9fe90774ac0ecbf8ed112d41146fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5smpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f1d96e1625c71100e773b21982fde152e58d
48d96841da879d6ab878342b1e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5smpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dbxhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:12Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:12 crc kubenswrapper[4824]: I0224 00:08:12.568092 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56aba328-bba3-447a-9fc1-9ed04311acf5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93bd599f0dec3d074b172aab55b3b74283b37d4dddc2e6a97b7319e8d7bd15b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://357f08b7a73fd26e3a3c8232ff5641ec70da20551310ce9348d3b1462a30735c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52d213e230e92cadfb65d0efd8b4dea058e9f947897ad48c9c9d123ed37263ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1580132fb7d7b9203017c471a27257d03282c1b3dd97167f647ecd3848ba69a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://1580132fb7d7b9203017c471a27257d03282c1b3dd97167f647ecd3848ba69a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:12Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:12 crc kubenswrapper[4824]: I0224 00:08:12.584030 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b147bde27310be40a9d49e2254ff858c03a32602bad1de273272b7b3292f057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-24T00:08:12Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:12 crc kubenswrapper[4824]: I0224 00:08:12.598452 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nwxht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea7ca84e-cb8c-4bdf-ba10-bd2aca09bdaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69860166bff170380f7baa9e9b405900881403e17fbb017db27e9136e1c3696e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lxc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nwxht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:12Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:12 crc kubenswrapper[4824]: I0224 00:08:12.616047 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc53949c-a6a4-48c1-a312-cc8d39a3238f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://706813f751a53e8e88d84668245bb4b94bc39d6b611f0fed9f774c226dc8c632\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fe1cb1e114c1440630a33e3dd127af3600c520e37148cc01ab2e4ed346941ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T00:06:33Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0224 00:06:03.976490 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0224 00:06:03.977317 1 observer_polling.go:159] Starting file observer\\\\nI0224 00:06:03.977962 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0224 00:06:03.978570 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0224 00:06:33.531382 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0224 00:06:33.531507 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:33Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:06:03Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45746da193a3aaa64e64067e2324afb4e4fb1a90a1357461ed97c0bbb5109f5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bf86745569172505df6632421bd1587317cb06e26e70937563bd7d0341c6086\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://213ad3b341ace3f4473aa48ecaaa41814fe670417431f6d5cd04be03482e597c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:12Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:12 crc kubenswrapper[4824]: I0224 00:08:12.631170 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc76b60d9c2e639d3340a172e70670a434008774f3ec5751dae2ffc78a913486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:12Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:12 crc kubenswrapper[4824]: I0224 00:08:12.644944 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-98z42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a648113f-3e46-4170-ba30-7155fefbb413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svh6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svh6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-98z42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:12Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:12 crc 
kubenswrapper[4824]: I0224 00:08:12.658441 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17795ce3-5917-4f02-a513-de53b7c702cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd198d5e4adeaa8e23f5262bc17476b82b1c76b3bfd06b385590eede8c0baa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://370e38e80ea6eb0a1add2e84d515cef4e8aeb77dc43beba4d0c3fdeb871f4fb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://370e38e80ea6eb0a1add2e84d515cef4e8aeb77dc43beba4d0c3fdeb871f4fb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:12Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:12 crc kubenswrapper[4824]: I0224 00:08:12.675535 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:12Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:12 crc kubenswrapper[4824]: I0224 00:08:12.689319 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:12Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:12 crc kubenswrapper[4824]: I0224 00:08:12.692998 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:08:12 crc kubenswrapper[4824]: E0224 00:08:12.693112 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 00:08:12 crc kubenswrapper[4824]: I0224 00:08:12.693281 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z42" Feb 24 00:08:12 crc kubenswrapper[4824]: E0224 00:08:12.693339 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z42" podUID="a648113f-3e46-4170-ba30-7155fefbb413" Feb 24 00:08:12 crc kubenswrapper[4824]: I0224 00:08:12.693511 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:08:12 crc kubenswrapper[4824]: E0224 00:08:12.693740 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 00:08:12 crc kubenswrapper[4824]: I0224 00:08:12.703302 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:12Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:12 crc kubenswrapper[4824]: I0224 00:08:12.718682 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvqfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cafa37efd3e722940c0c2daae6107dc324136c75f889084d39711c20d7e887d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cafa37efd3e722940c0c2daae6107dc324136c75f889084d39711c20d7e887d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T00:08:11Z\\\",\\\"message\\\":\\\"2026-02-24T00:07:26+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_61fbc54a-734f-43d4-9941-0e38fbccf016\\\\n2026-02-24T00:07:26+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_61fbc54a-734f-43d4-9941-0e38fbccf016 to /host/opt/cni/bin/\\\\n2026-02-24T00:07:26Z [verbose] multus-daemon started\\\\n2026-02-24T00:07:26Z [verbose] Readiness Indicator file check\\\\n2026-02-24T00:08:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvqfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:12Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:12 crc kubenswrapper[4824]: I0224 00:08:12.742471 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d985b875-dd5e-4767-a4e2-209894575a8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76341c8e4fd3e6aaeb74b20155da61acd373c362ebccb81a83847b8aa813c8ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76341c8e4fd3e6aaeb74b20155da61acd373c362ebccb81a83847b8aa813c8ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T00:07:53Z\\\",\\\"message\\\":\\\"61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0224 00:07:53.258981 6959 base_network_controller_pods.go:477] 
[default/openshift-multus/network-metrics-daemon-98z42] creating logical port openshift-multus_network-metrics-daemon-98z42 for pod on switch crc\\\\nI0224 00:07:53.258997 6959 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh in node crc\\\\nI0224 00:07:53.259010 6959 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh after 0 failed attempt(s)\\\\nI0224 00:07:53.259021 6959 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh\\\\nI0224 00:07:53.258757 6959 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-vcbgn in node crc\\\\nI0224 00:07:53.259038 6959 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-vcbgn after 0 failed attempt(s)\\\\nI0224 00:07:53.259043 6959 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/machine-config-daemon-vcbgn\\\\nI0224 00:07:53.259043 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4xjg6_openshift-ovn-kubernetes(d985b875-dd5e-4767-a4e2-209894575a8f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be
3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xjg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:12Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:12 crc kubenswrapper[4824]: I0224 00:08:12.761869 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23aab44a-c0da-4344-b8b1-7c754b00d6d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65bbd80285850cb217ec0994ab8841efb660fd2488077ee4968b0c1f5d156fbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7244717b123b325ac83f68b976c1e5761a76b1aacac6aab471d0b644386251d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5ce8e051c69710768e7b4fba06cc91026ce4d25baf35cd8f9235bfd348a4451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://837b5de339793bc363199eae6f141038bf069b2832f6375a8e01d49bef7ea63d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ede0889b30698004a8f288b9cc1bb7d00b194c21ff1936b2743f1e7f246e6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9c6e044265812e8e2116a81efcc38023cc8da2b92093e3f75c6efa4e359c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac9c6e044265812e8e2116a81efcc38023cc8da2b92093e3f75c6efa4e359c4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d404a30b7d0515ae872dab9ec3e60629de532d6f27ab0c9f1e7e3918bad30e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d404a30b7d0515ae872dab9ec3e60629de532d6f27ab0c9f1e7e3918bad30e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://513b48315443b87dc828225545ae02e1f47580596fc31a7935dd59b80065d26c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://513b48315443b87dc828225545ae02e1f47580596fc31a7935dd59b80065d26c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:12Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:12 crc kubenswrapper[4824]: I0224 00:08:12.763984 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 04:04:26.435515296 +0000 UTC Feb 24 00:08:12 crc kubenswrapper[4824]: I0224 00:08:12.780865 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5209eb40-b077-42f2-9239-27bf1cde0e05\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c8ceeca4d71401bcd6ae7d050ac58f840a0cc91f699f91f799930c896be922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd9b54ae9c9e895db2191be45a25d352d8c0617350519ad21f6ed3f2af941fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf3a76ba41dba79001ebad9083600800b0f094accb5dc60c2a364aa001909feb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db87a346b00854fb6769899a263eb5f7fad8cfa115365e2197c0e02f7285ff99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T00:06:36Z\\\"
,\\\"message\\\":\\\"W0224 00:06:35.979734 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0224 00:06:35.980117 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771891595 cert, and key in /tmp/serving-cert-1325703996/serving-signer.crt, /tmp/serving-cert-1325703996/serving-signer.key\\\\nI0224 00:06:36.313594 1 observer_polling.go:159] Starting file observer\\\\nW0224 00:06:36.323413 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0224 00:06:36.325040 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 00:06:36.326853 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1325703996/tls.crt::/tmp/serving-cert-1325703996/tls.key\\\\\\\"\\\\nF0224 00:06:36.787238 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:06:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81bb21556a01a403315d2c9623aa07ab39ac4a789433423a89eda1da3547b6d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:12Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:12 crc kubenswrapper[4824]: I0224 00:08:12.798338 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684e1bbbe61030fdd75db917dbf9ac4fedf392502a653c8f4600dc829410aa95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://983e13c7b9d6d0b8183250aedfe93ec3ac42de4dae305c03caa9f653082bb329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:12Z is after 2025-08-24T17:21:41Z" 
Feb 24 00:08:13 crc kubenswrapper[4824]: I0224 00:08:13.493660 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wvqfl_15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac/kube-multus/0.log" Feb 24 00:08:13 crc kubenswrapper[4824]: I0224 00:08:13.493743 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wvqfl" event={"ID":"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac","Type":"ContainerStarted","Data":"a39d567bc7ae642dd15ce2ea8650e1aa0fd3688e3c3c061ad145edf2bf963c06"} Feb 24 00:08:13 crc kubenswrapper[4824]: I0224 00:08:13.515321 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0525cd89-44e0-47f1-856c-f566eb21596a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391a3f09eb431d4cc4af8eba60855591dd9fe90774ac0ecbf8ed112d41146fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5smpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f1d96e1625c71100e773b21982fde152e58d48d96841da879d6ab878342b1e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5smpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dbxhh\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:13 crc kubenswrapper[4824]: I0224 00:08:13.532106 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56aba328-bba3-447a-9fc1-9ed04311acf5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93bd599f0dec3d074b172aab55b3b74283b37d4dddc2e6a97b7319e8d7bd15b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://357f08b7a73fd26e3a3c8232ff5641ec70da20551310ce9348d3b1462a30735c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52d213e230e92cadfb65d0efd8b4dea058e9f947897ad48c9c9d123ed37263ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1580132fb7d7b92030
17c471a27257d03282c1b3dd97167f647ecd3848ba69a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1580132fb7d7b9203017c471a27257d03282c1b3dd97167f647ecd3848ba69a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:13 crc kubenswrapper[4824]: I0224 00:08:13.549314 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b147bde27310be40a9d49e2254ff858c03a32602bad1de273272b7b3292f057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-24T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:13 crc kubenswrapper[4824]: I0224 00:08:13.565739 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nwxht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea7ca84e-cb8c-4bdf-ba10-bd2aca09bdaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69860166bff170380f7baa9e9b405900881403e17fbb017db27e9136e1c3696e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lxc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nwxht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:13 crc kubenswrapper[4824]: I0224 00:08:13.582292 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939ca085-9383-42e6-b7d6-37f101137273\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc64aac8c2354ba053a22c4c5cde7
371a99c7aaaece1fe2dd2f7b41eceb4e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13c56d6a66d6912c9ae019eb515bfeb043d9ce26a200da5330581d07cd849ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-02-24T00:07:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vcbgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:13 crc kubenswrapper[4824]: I0224 00:08:13.600256 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d64vq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b16a40794ba6c7bfbe672f5438284c9479efc8d1fe6a87b8292b3595799772e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:0
7:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd5818c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cd5818c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d152897b22f88ea3328011ed0b921e0f33f5c0375f4af98b424da48c20a268\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43d152897b22f88ea3328011ed0b921e0f33f5c0375f4af98b424da48c20a268\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86a454258a9043824fa0ce318ad33be2fc4c029f75f0fc489b22716d69d67d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86a454258a9043824fa0ce318ad33be2fc4c029f75f0fc489b22716d69d67d45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:31Z\\\"}},\\\"volume
Mounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d64vq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:13 crc kubenswrapper[4824]: I0224 00:08:13.613256 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2zsq6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bffd69c2-56a8-4fa0-9fbf-82a508f80ec1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4215a2974f4f4d1ab7b093c8d457f1f450276c89565a95ea13f1d42db9f07afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkj25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2zsq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:13 crc kubenswrapper[4824]: I0224 00:08:13.627697 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc53949c-a6a4-48c1-a312-cc8d39a3238f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://706813f751a53e8e88d84668245bb4b94bc39d6b611f0fed9f774c226dc8c632\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fe1cb1e114c1440630a33e3dd127af3600c520e37148cc01ab2e4ed346941ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T00:06:33Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0224 00:06:03.976490 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0224 00:06:03.977317 1 observer_polling.go:159] Starting file observer\\\\nI0224 00:06:03.977962 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0224 00:06:03.978570 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0224 00:06:33.531382 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0224 00:06:33.531507 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-24T00:06:33Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:06:03Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45746da193a3aaa64e64067e2324afb4e4fb1a90a1357461ed97c0bbb5109f5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bf86745569172505df6632421bd1587317cb06e26e70937563bd7d0341c6086\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://213ad3b341ace3f4473aa48ecaaa41814fe670417431f6d5cd04be03482e597c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:13 crc kubenswrapper[4824]: I0224 00:08:13.641283 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc76b60d9c2e639d3340a172e70670a434008774f3ec5751dae2ffc78a913486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:13 crc kubenswrapper[4824]: I0224 00:08:13.658568 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17795ce3-5917-4f02-a513-de53b7c702cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd198d5e4adeaa8e23f5262bc17476b82b1c76b3bfd06b385590eede8c0baa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://370e38e80ea6eb0a1add2e84d515cef4e8aeb77dc43beba4d0c3fdeb871f4fb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://370e38e80ea6eb0a1add2e84d515cef4e8aeb77dc43beba4d0c3fdeb871f4fb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:13 crc kubenswrapper[4824]: I0224 00:08:13.678212 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:13 crc kubenswrapper[4824]: I0224 00:08:13.693984 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:08:13 crc kubenswrapper[4824]: E0224 00:08:13.694156 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 00:08:13 crc kubenswrapper[4824]: I0224 00:08:13.700703 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:13 crc kubenswrapper[4824]: I0224 00:08:13.717052 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-98z42" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a648113f-3e46-4170-ba30-7155fefbb413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svh6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svh6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-98z42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:13 crc 
kubenswrapper[4824]: I0224 00:08:13.744083 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23aab44a-c0da-4344-b8b1-7c754b00d6d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65bbd80285850cb217ec0994ab8841efb660fd2488077ee4968b0c1f5d156fbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://7244717b123b325ac83f68b976c1e5761a76b1aacac6aab471d0b644386251d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5ce8e051c69710768e7b4fba06cc91026ce4d25baf35cd8f9235bfd348a4451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://837b5de339793bc363199eae6f141038bf069b2832f6375a8e01d49bef7ea63d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ede0889b30698004a8f288b9cc1bb7d00b194c21ff1936b2743f1e7f246e6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9c6e044265812e8e2116a81efcc38023cc8da2b92093e3f75c6efa4e359c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac9c6e044265812e8e2116a81efcc38023cc8da2b92093e3f75c6efa4e359c4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d404a30b7d0515ae872dab9ec3e60629de532d6f27ab0c9f1e7e3918bad30e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d404a30b7d0515ae872dab9ec3e60629de532d6f27ab0c9f1e7e3918bad30e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://513b48315443b87dc828225545ae02e1f47580596fc31a7935dd59b80065d26c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://513b48315443b87dc828225545ae02e1f47580596fc31a7935dd59b80065d26c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:13 crc kubenswrapper[4824]: I0224 00:08:13.765177 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 05:55:08.561536232 +0000 UTC Feb 24 00:08:13 crc kubenswrapper[4824]: I0224 00:08:13.765435 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5209eb40-b077-42f2-9239-27bf1cde0e05\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c8ceeca4d71401bcd6ae7d050ac58f840a0cc91f699f91f799930c896be922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd9b54ae9c9e895db2191be45a25d352d8c0617350519ad21f6ed3f2af941fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf3a76ba41dba79001ebad9083600800b0f094accb5dc60c2a364aa001909feb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db87a346b00854fb6769899a263eb5f7fad8cfa115365e2197c0e02f7285ff99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T00:06:36Z\\\"
,\\\"message\\\":\\\"W0224 00:06:35.979734 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0224 00:06:35.980117 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771891595 cert, and key in /tmp/serving-cert-1325703996/serving-signer.crt, /tmp/serving-cert-1325703996/serving-signer.key\\\\nI0224 00:06:36.313594 1 observer_polling.go:159] Starting file observer\\\\nW0224 00:06:36.323413 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0224 00:06:36.325040 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 00:06:36.326853 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1325703996/tls.crt::/tmp/serving-cert-1325703996/tls.key\\\\\\\"\\\\nF0224 00:06:36.787238 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:06:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81bb21556a01a403315d2c9623aa07ab39ac4a789433423a89eda1da3547b6d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:13 crc kubenswrapper[4824]: I0224 00:08:13.783463 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684e1bbbe61030fdd75db917dbf9ac4fedf392502a653c8f4600dc829410aa95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://983e13c7b9d6d0b8183250aedfe93ec3ac42de4dae305c03caa9f653082bb329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:13Z is after 2025-08-24T17:21:41Z" 
Feb 24 00:08:13 crc kubenswrapper[4824]: I0224 00:08:13.802833 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:13 crc kubenswrapper[4824]: I0224 00:08:13.822444 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvqfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a39d567bc7ae642dd15ce2ea8650e1aa0fd3688e3c3c061ad145edf2bf963c06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cafa37efd3e722940c0c2daae6107dc324136c75f889084d39711c20d7e887d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T00:08:11Z\\\",\\\"message\\\":\\\"2026-02-24T00:07:26+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_61fbc54a-734f-43d4-9941-0e38fbccf016\\\\n2026-02-24T00:07:26+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_61fbc54a-734f-43d4-9941-0e38fbccf016 to /host/opt/cni/bin/\\\\n2026-02-24T00:07:26Z [verbose] multus-daemon started\\\\n2026-02-24T00:07:26Z [verbose] 
Readiness Indicator file check\\\\n2026-02-24T00:08:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvqfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:13 crc kubenswrapper[4824]: I0224 00:08:13.845125 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d985b875-dd5e-4767-a4e2-209894575a8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76341c8e4fd3e6aaeb74b20155da61acd373c362ebccb81a83847b8aa813c8ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76341c8e4fd3e6aaeb74b20155da61acd373c362ebccb81a83847b8aa813c8ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T00:07:53Z\\\",\\\"message\\\":\\\"61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where 
column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0224 00:07:53.258981 6959 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-98z42] creating logical port openshift-multus_network-metrics-daemon-98z42 for pod on switch crc\\\\nI0224 00:07:53.258997 6959 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh in node crc\\\\nI0224 00:07:53.259010 6959 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh after 0 failed attempt(s)\\\\nI0224 00:07:53.259021 6959 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh\\\\nI0224 00:07:53.258757 6959 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-vcbgn in node crc\\\\nI0224 00:07:53.259038 6959 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-vcbgn after 0 failed attempt(s)\\\\nI0224 00:07:53.259043 6959 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/machine-config-daemon-vcbgn\\\\nI0224 00:07:53.259043 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4xjg6_openshift-ovn-kubernetes(d985b875-dd5e-4767-a4e2-209894575a8f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be
3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xjg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:14 crc kubenswrapper[4824]: I0224 00:08:14.693851 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:08:14 crc kubenswrapper[4824]: E0224 00:08:14.694051 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 00:08:14 crc kubenswrapper[4824]: I0224 00:08:14.693877 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z42" Feb 24 00:08:14 crc kubenswrapper[4824]: I0224 00:08:14.694187 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:08:14 crc kubenswrapper[4824]: E0224 00:08:14.694351 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z42" podUID="a648113f-3e46-4170-ba30-7155fefbb413" Feb 24 00:08:14 crc kubenswrapper[4824]: E0224 00:08:14.694433 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 00:08:14 crc kubenswrapper[4824]: I0224 00:08:14.765657 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 04:26:57.43370022 +0000 UTC Feb 24 00:08:15 crc kubenswrapper[4824]: I0224 00:08:15.693747 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:08:15 crc kubenswrapper[4824]: E0224 00:08:15.693955 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 00:08:15 crc kubenswrapper[4824]: I0224 00:08:15.766158 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 05:04:49.336824986 +0000 UTC Feb 24 00:08:16 crc kubenswrapper[4824]: I0224 00:08:16.692776 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:08:16 crc kubenswrapper[4824]: I0224 00:08:16.692808 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z42" Feb 24 00:08:16 crc kubenswrapper[4824]: I0224 00:08:16.692835 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:08:16 crc kubenswrapper[4824]: E0224 00:08:16.693648 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 00:08:16 crc kubenswrapper[4824]: E0224 00:08:16.693761 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 00:08:16 crc kubenswrapper[4824]: E0224 00:08:16.693854 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z42" podUID="a648113f-3e46-4170-ba30-7155fefbb413" Feb 24 00:08:16 crc kubenswrapper[4824]: I0224 00:08:16.708097 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-98z42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a648113f-3e46-4170-ba30-7155fefbb413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svh6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svh6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-98z42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:16Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:16 crc 
kubenswrapper[4824]: I0224 00:08:16.720510 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17795ce3-5917-4f02-a513-de53b7c702cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd198d5e4adeaa8e23f5262bc17476b82b1c76b3bfd06b385590eede8c0baa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://370e38e80ea6eb0a1add2e84d515cef4e8aeb77dc43beba4d0c3fdeb871f4fb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://370e38e80ea6eb0a1add2e84d515cef4e8aeb77dc43beba4d0c3fdeb871f4fb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:16Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:16 crc kubenswrapper[4824]: I0224 00:08:16.738180 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:16Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:16 crc kubenswrapper[4824]: I0224 00:08:16.753040 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:16Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:16 crc kubenswrapper[4824]: I0224 00:08:16.766809 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 10:50:56.061301222 +0000 UTC Feb 24 00:08:16 crc kubenswrapper[4824]: I0224 00:08:16.768746 4824 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:16Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:16 crc kubenswrapper[4824]: E0224 00:08:16.784204 4824 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 24 00:08:16 crc kubenswrapper[4824]: I0224 00:08:16.789251 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvqfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a39d567bc7ae642dd15ce2ea8650e1aa0fd3688e3c3c061ad145edf2bf963c06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cafa37efd3e722940c0c2daae6107dc324136c75f889084d39711c20d7e887d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T00:08:11Z\\\",\\\"message\\\":\\\"2026-02-24T00:07:26+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_61fbc54a-734f-43d4-9941-0e38fbccf016\\\\n2026-02-24T00:07:26+00:00 [cnibincopy] Successfully moved files 
in /host/opt/cni/bin/upgrade_61fbc54a-734f-43d4-9941-0e38fbccf016 to /host/opt/cni/bin/\\\\n2026-02-24T00:07:26Z [verbose] multus-daemon started\\\\n2026-02-24T00:07:26Z [verbose] Readiness Indicator file check\\\\n2026-02-24T00:08:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\
":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvqfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:16Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:16 crc kubenswrapper[4824]: I0224 00:08:16.814516 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d985b875-dd5e-4767-a4e2-209894575a8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76341c8e4fd3e6aaeb74b20155da61acd373c362ebccb81a83847b8aa813c8ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76341c8e4fd3e6aaeb74b20155da61acd373c362ebccb81a83847b8aa813c8ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T00:07:53Z\\\",\\\"message\\\":\\\"61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where 
column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0224 00:07:53.258981 6959 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-98z42] creating logical port openshift-multus_network-metrics-daemon-98z42 for pod on switch crc\\\\nI0224 00:07:53.258997 6959 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh in node crc\\\\nI0224 00:07:53.259010 6959 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh after 0 failed attempt(s)\\\\nI0224 00:07:53.259021 6959 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh\\\\nI0224 00:07:53.258757 6959 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-vcbgn in node crc\\\\nI0224 00:07:53.259038 6959 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-vcbgn after 0 failed attempt(s)\\\\nI0224 00:07:53.259043 6959 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/machine-config-daemon-vcbgn\\\\nI0224 00:07:53.259043 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4xjg6_openshift-ovn-kubernetes(d985b875-dd5e-4767-a4e2-209894575a8f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be
3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xjg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:16Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:16 crc kubenswrapper[4824]: I0224 00:08:16.835557 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23aab44a-c0da-4344-b8b1-7c754b00d6d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65bbd80285850cb217ec0994ab8841efb660fd2488077ee4968b0c1f5d156fbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7244717b123b325ac83f68b976c1e5761a76b1aacac6aab471d0b644386251d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5ce8e051c69710768e7b4fba06cc91026ce4d25baf35cd8f9235bfd348a4451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://837b5de339793bc363199eae6f141038bf069b2832f6375a8e01d49bef7ea63d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ede0889b30698004a8f288b9cc1bb7d00b194c21ff1936b2743f1e7f246e6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9c6e044265812e8e2116a81efcc38023cc8da2b92093e3f75c6efa4e359c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac9c6e044265812e8e2116a81efcc38023cc8da2b92093e3f75c6efa4e359c4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d404a30b7d0515ae872dab9ec3e60629de532d6f27ab0c9f1e7e3918bad30e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d404a30b7d0515ae872dab9ec3e60629de532d6f27ab0c9f1e7e3918bad30e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://513b48315443b87dc828225545ae02e1f47580596fc31a7935dd59b80065d26c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://513b48315443b87dc828225545ae02e1f47580596fc31a7935dd59b80065d26c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:16Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:16 crc kubenswrapper[4824]: I0224 00:08:16.850737 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5209eb40-b077-42f2-9239-27bf1cde0e05\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c8ceeca4d71401bcd6ae7d050ac58f840a0cc91f699f91f799930c896be922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd9b54ae9c9e895db2191be45a25d352d8c0617350519ad21f6ed3f2af941fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf3a76ba41dba79001ebad9083600800b0f094accb5dc60c2a364aa001909feb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db87a346b00854fb6769899a263eb5f7fad8cfa115365e2197c0e02f7285ff99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T00:06:36Z\\\",\\\"message\\\":\\\"W0224 00:06:35.979734 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0224 00:06:35.980117 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771891595 cert, and key in /tmp/serving-cert-1325703996/serving-signer.crt, /tmp/serving-cert-1325703996/serving-signer.key\\\\nI0224 00:06:36.313594 1 observer_polling.go:159] Starting file observer\\\\nW0224 00:06:36.323413 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0224 00:06:36.325040 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 00:06:36.326853 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1325703996/tls.crt::/tmp/serving-cert-1325703996/tls.key\\\\\\\"\\\\nF0224 00:06:36.787238 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:06:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81bb21556a01a403315d2c9623aa07ab39ac4a789433423a89eda1da3547b6d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:16Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:16 crc kubenswrapper[4824]: I0224 00:08:16.863552 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684e1bbbe61030fdd75db917dbf9ac4fedf392502a653c8f4600dc829410aa95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://983e13c7b9d6d0b8183250aedfe93ec3ac42de4dae305c03caa9f653082bb329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:16Z is after 2025-08-24T17:21:41Z" 
Feb 24 00:08:16 crc kubenswrapper[4824]: I0224 00:08:16.876103 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939ca085-9383-42e6-b7d6-37f101137273\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc64aac8c2354ba053a22c4c5cde7371a99c7aaaece1fe2dd2f7b41eceb4e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13c56d6a66d6912c9ae019eb515bfeb043d9ce26a200da5330581d07cd849ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vcbgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:16Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:16 crc kubenswrapper[4824]: I0224 00:08:16.890591 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d64vq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b16a40794ba6c7bfbe672f5438284c9479efc8d1fe6a87b8292b3595799772e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd58
18c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cd5818c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d152897b22f88ea3328011ed0b921e0f33f5c0375f4af98b424da48c20a268\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43d152897b22f88ea3328011ed0b921e0f33f5c0375f4af98b424da48c20a268\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:30Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86a454258a9043824fa0ce318ad33be2fc4c029f75f0fc489b22716d69d67d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86a454258a9043824fa0ce318ad33be2fc4c029f75f0fc489b22716d69d67d45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d64vq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:16Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:16 crc kubenswrapper[4824]: I0224 00:08:16.903858 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2zsq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bffd69c2-56a8-4fa0-9fbf-82a508f80ec1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4215a2974f4f4d1ab7b093c8d457f1f450276c89565a95ea13f1d42db9f07afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-02-24T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkj25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2zsq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:16Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:16 crc kubenswrapper[4824]: I0224 00:08:16.916783 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0525cd89-44e0-47f1-856c-f566eb21596a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391a3f09eb431d4cc4af8eba60855591dd9fe90774ac0ecbf8ed112d41146fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5smpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f1d96e1625c71100e773b21982fde152e58d
48d96841da879d6ab878342b1e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5smpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dbxhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:16Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:16 crc kubenswrapper[4824]: I0224 00:08:16.932049 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56aba328-bba3-447a-9fc1-9ed04311acf5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93bd599f0dec3d074b172aab55b3b74283b37d4dddc2e6a97b7319e8d7bd15b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://357f08b7a73fd26e3a3c8232ff5641ec70da20551310ce9348d3b1462a30735c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52d213e230e92cadfb65d0efd8b4dea058e9f947897ad48c9c9d123ed37263ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1580132fb7d7b9203017c471a27257d03282c1b3dd97167f647ecd3848ba69a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://1580132fb7d7b9203017c471a27257d03282c1b3dd97167f647ecd3848ba69a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:16Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:16 crc kubenswrapper[4824]: I0224 00:08:16.945901 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b147bde27310be40a9d49e2254ff858c03a32602bad1de273272b7b3292f057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-24T00:08:16Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:16 crc kubenswrapper[4824]: I0224 00:08:16.958362 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nwxht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea7ca84e-cb8c-4bdf-ba10-bd2aca09bdaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69860166bff170380f7baa9e9b405900881403e17fbb017db27e9136e1c3696e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lxc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nwxht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:16Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:16 crc kubenswrapper[4824]: I0224 00:08:16.974165 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc53949c-a6a4-48c1-a312-cc8d39a3238f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://706813f751a53e8e88d84668245bb4b94bc39d6b611f0fed9f774c226dc8c632\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fe1cb1e114c1440630a33e3dd127af3600c520e37148cc01ab2e4ed346941ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T00:06:33Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0224 00:06:03.976490 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0224 00:06:03.977317 1 observer_polling.go:159] Starting file observer\\\\nI0224 00:06:03.977962 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0224 00:06:03.978570 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0224 00:06:33.531382 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0224 00:06:33.531507 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:33Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:06:03Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45746da193a3aaa64e64067e2324afb4e4fb1a90a1357461ed97c0bbb5109f5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bf86745569172505df6632421bd1587317cb06e26e70937563bd7d0341c6086\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://213ad3b341ace3f4473aa48ecaaa41814fe670417431f6d5cd04be03482e597c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:16Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:16 crc kubenswrapper[4824]: I0224 00:08:16.990701 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc76b60d9c2e639d3340a172e70670a434008774f3ec5751dae2ffc78a913486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:16Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:17 crc kubenswrapper[4824]: I0224 00:08:17.693617 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:08:17 crc kubenswrapper[4824]: E0224 00:08:17.693835 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 00:08:17 crc kubenswrapper[4824]: I0224 00:08:17.767067 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 11:32:04.689457131 +0000 UTC Feb 24 00:08:18 crc kubenswrapper[4824]: I0224 00:08:18.693760 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z42" Feb 24 00:08:18 crc kubenswrapper[4824]: I0224 00:08:18.693906 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:08:18 crc kubenswrapper[4824]: I0224 00:08:18.694387 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:08:18 crc kubenswrapper[4824]: E0224 00:08:18.694581 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z42" podUID="a648113f-3e46-4170-ba30-7155fefbb413" Feb 24 00:08:18 crc kubenswrapper[4824]: E0224 00:08:18.694648 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 00:08:18 crc kubenswrapper[4824]: E0224 00:08:18.694691 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 00:08:18 crc kubenswrapper[4824]: I0224 00:08:18.695165 4824 scope.go:117] "RemoveContainer" containerID="76341c8e4fd3e6aaeb74b20155da61acd373c362ebccb81a83847b8aa813c8ba" Feb 24 00:08:18 crc kubenswrapper[4824]: I0224 00:08:18.767946 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 23:16:59.566175688 +0000 UTC Feb 24 00:08:19 crc kubenswrapper[4824]: I0224 00:08:19.515954 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xjg6_d985b875-dd5e-4767-a4e2-209894575a8f/ovnkube-controller/2.log" Feb 24 00:08:19 crc kubenswrapper[4824]: I0224 00:08:19.518084 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" 
event={"ID":"d985b875-dd5e-4767-a4e2-209894575a8f","Type":"ContainerStarted","Data":"5817c201e3633a7aa1c9aa18cf5c3bf28f2a23c56257be31e935e9e29bd522ac"} Feb 24 00:08:19 crc kubenswrapper[4824]: I0224 00:08:19.519202 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:08:19 crc kubenswrapper[4824]: I0224 00:08:19.534228 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc53949c-a6a4-48c1-a312-cc8d39a3238f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://706813f751a53e8e88d84668245bb4b94bc39d6b611f0fed9f774c226dc8c632\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fe1cb1e114c1440630a33e3dd127af3600c520e37148cc01ab2e4ed346941ba\\\",\\\"exitCode\\\":255,\\\"finished
At\\\":\\\"2026-02-24T00:06:33Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0224 00:06:03.976490 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0224 00:06:03.977317 1 observer_polling.go:159] Starting file observer\\\\nI0224 00:06:03.977962 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0224 00:06:03.978570 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0224 00:06:33.531382 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0224 00:06:33.531507 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:33Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:06:03Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45746da193a3aaa64e64067e2324afb4e4fb1a90a1357461ed97c0bbb5109f5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bf86745569172505df6632421bd1587317cb06e26e70937563bd7d0341c6086\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://213ad3b341ace3f4473aa48ecaaa41814fe670417431f6d5cd04be03482e597c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:19Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:19 crc kubenswrapper[4824]: I0224 00:08:19.549614 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc76b60d9c2e639d3340a172e70670a434008774f3ec5751dae2ffc78a913486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:19Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:19 crc kubenswrapper[4824]: I0224 00:08:19.564681 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:19Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:19 crc kubenswrapper[4824]: I0224 00:08:19.576175 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-98z42" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a648113f-3e46-4170-ba30-7155fefbb413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svh6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svh6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-98z42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:19Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:19 crc 
kubenswrapper[4824]: I0224 00:08:19.587726 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17795ce3-5917-4f02-a513-de53b7c702cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd198d5e4adeaa8e23f5262bc17476b82b1c76b3bfd06b385590eede8c0baa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://370e38e80ea6eb0a1add2e84d515cef4e8aeb77dc43beba4d0c3fdeb871f4fb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://370e38e80ea6eb0a1add2e84d515cef4e8aeb77dc43beba4d0c3fdeb871f4fb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:19Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:19 crc kubenswrapper[4824]: I0224 00:08:19.601571 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:19Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:19 crc kubenswrapper[4824]: I0224 00:08:19.614618 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684e1bbbe61030fdd75db917dbf9ac4fedf392502a653c8f4600dc829410aa95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://983e13c7b9d6d0b8183250aedfe93ec3ac42de4dae305c03caa9f653082bb329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:19Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:19 crc kubenswrapper[4824]: I0224 00:08:19.638799 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:19Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:19 crc kubenswrapper[4824]: I0224 00:08:19.653921 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvqfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a39d567bc7ae642dd15ce2ea8650e1aa0fd3688e3c3c061ad145edf2bf963c06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cafa37efd3e722940c0c2daae6107dc324136c75f889084d39711c20d7e887d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T00:08:11Z\\\",\\\"message\\\":\\\"2026-02-24T00:07:26+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_61fbc54a-734f-43d4-9941-0e38fbccf016\\\\n2026-02-24T00:07:26+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_61fbc54a-734f-43d4-9941-0e38fbccf016 to /host/opt/cni/bin/\\\\n2026-02-24T00:07:26Z [verbose] multus-daemon started\\\\n2026-02-24T00:07:26Z [verbose] 
Readiness Indicator file check\\\\n2026-02-24T00:08:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvqfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:19Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:19 crc kubenswrapper[4824]: I0224 00:08:19.672585 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d985b875-dd5e-4767-a4e2-209894575a8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5817c201e3633a7aa1c9aa18cf5c3bf28f2a23c56257be31e935e9e29bd522ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76341c8e4fd3e6aaeb74b20155da61acd373c362ebccb81a83847b8aa813c8ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T00:07:53Z\\\",\\\"message\\\":\\\"61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where 
column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0224 00:07:53.258981 6959 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-98z42] creating logical port openshift-multus_network-metrics-daemon-98z42 for pod on switch crc\\\\nI0224 00:07:53.258997 6959 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh in node crc\\\\nI0224 00:07:53.259010 6959 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh after 0 failed attempt(s)\\\\nI0224 00:07:53.259021 6959 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh\\\\nI0224 00:07:53.258757 6959 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-vcbgn in node crc\\\\nI0224 00:07:53.259038 6959 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-vcbgn after 0 failed attempt(s)\\\\nI0224 00:07:53.259043 6959 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/machine-config-daemon-vcbgn\\\\nI0224 00:07:53.259043 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xjg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:19Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:19 crc kubenswrapper[4824]: I0224 00:08:19.693014 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:08:19 crc kubenswrapper[4824]: E0224 00:08:19.693155 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 00:08:19 crc kubenswrapper[4824]: I0224 00:08:19.693125 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23aab44a-c0da-4344-b8b1-7c754b00d6d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65bbd80285850cb217ec0994ab8841efb660fd2488077ee4968b0c1f5d156fbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"
/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7244717b123b325ac83f68b976c1e5761a76b1aacac6aab471d0b644386251d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5ce8e051c69710768e7b4fba06cc91026ce4d25baf35cd8f9235bfd348a4451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://837b5de339793bc363199eae6f141038bf069b2832f6375a8e01d49bef7ea63d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036
cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ede0889b30698004a8f288b9cc1bb7d00b194c21ff1936b2743f1e7f246e6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9c6e044265812e8e2116a81efcc38023cc8da2b92093e3f75c6efa4e359c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117
b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac9c6e044265812e8e2116a81efcc38023cc8da2b92093e3f75c6efa4e359c4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d404a30b7d0515ae872dab9ec3e60629de532d6f27ab0c9f1e7e3918bad30e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d404a30b7d0515ae872dab9ec3e60629de532d6f27ab0c9f1e7e3918bad30e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://513b48315443b87dc828225545ae02e1f47580596fc31a7935dd59b80065d26c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://513b48315443b87dc828225545ae02e1f47580596fc31a7935dd59b80065d26c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\
\"2026-02-24T00:05:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:19Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:19 crc kubenswrapper[4824]: I0224 00:08:19.707390 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5209eb40-b077-42f2-9239-27bf1cde0e05\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c8ceeca4d71401bcd6ae7d050ac58f840a0cc91f699f91f799930c896be922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd9b54ae9c9e895db2191be45a25d352d8c0617350519ad21f6ed3f2af941fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf3a76ba41dba79001ebad9083600800b0f094accb5dc60c2a364aa001909feb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db87a346b00854fb6769899a263eb5f7fad8cfa115365e2197c0e02f7285ff99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T00:06:36Z\\\"
,\\\"message\\\":\\\"W0224 00:06:35.979734 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0224 00:06:35.980117 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771891595 cert, and key in /tmp/serving-cert-1325703996/serving-signer.crt, /tmp/serving-cert-1325703996/serving-signer.key\\\\nI0224 00:06:36.313594 1 observer_polling.go:159] Starting file observer\\\\nW0224 00:06:36.323413 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0224 00:06:36.325040 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 00:06:36.326853 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1325703996/tls.crt::/tmp/serving-cert-1325703996/tls.key\\\\\\\"\\\\nF0224 00:06:36.787238 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:06:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81bb21556a01a403315d2c9623aa07ab39ac4a789433423a89eda1da3547b6d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:19Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:19 crc kubenswrapper[4824]: I0224 00:08:19.719096 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nwxht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea7ca84e-cb8c-4bdf-ba10-bd2aca09bdaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69860166bff170380f7baa9e9b405900881403e17fbb017db27e9136e1c3696e\\\",\\\"image\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lxc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nwxht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:19Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:19 crc kubenswrapper[4824]: I0224 00:08:19.731218 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"939ca085-9383-42e6-b7d6-37f101137273\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc64aac8c2354ba053a22c4c5cde7371a99c7aaaece1fe2dd2f7b41eceb4e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13c56d6a66d6912c9ae019eb515bfeb043d9ce26
a200da5330581d07cd849ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vcbgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:19Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:19 crc kubenswrapper[4824]: I0224 00:08:19.751467 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d64vq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b16a40794ba6c7bfbe672f5438284c9479efc8d1fe6a87b8292b3595799772e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd58
18c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cd5818c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d152897b22f88ea3328011ed0b921e0f33f5c0375f4af98b424da48c20a268\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43d152897b22f88ea3328011ed0b921e0f33f5c0375f4af98b424da48c20a268\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:30Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86a454258a9043824fa0ce318ad33be2fc4c029f75f0fc489b22716d69d67d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86a454258a9043824fa0ce318ad33be2fc4c029f75f0fc489b22716d69d67d45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d64vq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:19Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:19 crc kubenswrapper[4824]: I0224 00:08:19.764280 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2zsq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bffd69c2-56a8-4fa0-9fbf-82a508f80ec1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4215a2974f4f4d1ab7b093c8d457f1f450276c89565a95ea13f1d42db9f07afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-02-24T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkj25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2zsq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:19Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:19 crc kubenswrapper[4824]: I0224 00:08:19.768139 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 10:58:36.974345751 +0000 UTC Feb 24 00:08:19 crc kubenswrapper[4824]: I0224 00:08:19.776632 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0525cd89-44e0-47f1-856c-f566eb21596a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391a3f09eb431d4cc4af8eba60855591dd9fe90774ac0ecbf8ed112d41146fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5smpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f1d96e1625c71100e773b21982fde152e58d
48d96841da879d6ab878342b1e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5smpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dbxhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:19Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:19 crc kubenswrapper[4824]: I0224 00:08:19.789588 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56aba328-bba3-447a-9fc1-9ed04311acf5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93bd599f0dec3d074b172aab55b3b74283b37d4dddc2e6a97b7319e8d7bd15b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://357f08b7a73fd26e3a3c8232ff5641ec70da20551310ce9348d3b1462a30735c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52d213e230e92cadfb65d0efd8b4dea058e9f947897ad48c9c9d123ed37263ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1580132fb7d7b9203017c471a27257d03282c1b3dd97167f647ecd3848ba69a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://1580132fb7d7b9203017c471a27257d03282c1b3dd97167f647ecd3848ba69a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:19Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:19 crc kubenswrapper[4824]: I0224 00:08:19.801905 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b147bde27310be40a9d49e2254ff858c03a32602bad1de273272b7b3292f057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-24T00:08:19Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:20 crc kubenswrapper[4824]: I0224 00:08:20.524004 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xjg6_d985b875-dd5e-4767-a4e2-209894575a8f/ovnkube-controller/3.log" Feb 24 00:08:20 crc kubenswrapper[4824]: I0224 00:08:20.525219 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xjg6_d985b875-dd5e-4767-a4e2-209894575a8f/ovnkube-controller/2.log" Feb 24 00:08:20 crc kubenswrapper[4824]: I0224 00:08:20.529468 4824 generic.go:334] "Generic (PLEG): container finished" podID="d985b875-dd5e-4767-a4e2-209894575a8f" containerID="5817c201e3633a7aa1c9aa18cf5c3bf28f2a23c56257be31e935e9e29bd522ac" exitCode=1 Feb 24 00:08:20 crc kubenswrapper[4824]: I0224 00:08:20.529562 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" event={"ID":"d985b875-dd5e-4767-a4e2-209894575a8f","Type":"ContainerDied","Data":"5817c201e3633a7aa1c9aa18cf5c3bf28f2a23c56257be31e935e9e29bd522ac"} Feb 24 00:08:20 crc kubenswrapper[4824]: I0224 00:08:20.529618 4824 scope.go:117] "RemoveContainer" containerID="76341c8e4fd3e6aaeb74b20155da61acd373c362ebccb81a83847b8aa813c8ba" Feb 24 00:08:20 crc kubenswrapper[4824]: I0224 00:08:20.530774 4824 scope.go:117] "RemoveContainer" containerID="5817c201e3633a7aa1c9aa18cf5c3bf28f2a23c56257be31e935e9e29bd522ac" Feb 24 00:08:20 crc kubenswrapper[4824]: E0224 00:08:20.531169 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4xjg6_openshift-ovn-kubernetes(d985b875-dd5e-4767-a4e2-209894575a8f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" podUID="d985b875-dd5e-4767-a4e2-209894575a8f" Feb 24 00:08:20 crc kubenswrapper[4824]: I0224 
00:08:20.550700 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56aba328-bba3-447a-9fc1-9ed04311acf5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93bd599f0dec3d074b172aab55b3b74283b37d4dddc2e6a97b7319e8d7bd15b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://357f08b7a73fd26e3a3c8232ff5641ec70da20551310ce9348d3b1462a30735c\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52d213e230e92cadfb65d0efd8b4dea058e9f947897ad48c9c9d123ed37263ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1580132fb7d7b9203017c471a27257d03282c1b3dd97167f647ecd3848ba69a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\
"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1580132fb7d7b9203017c471a27257d03282c1b3dd97167f647ecd3848ba69a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:20Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:20 crc kubenswrapper[4824]: I0224 00:08:20.568739 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b147bde27310be40a9d49e2254ff858c03a32602bad1de273272b7b3292f057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-24T00:08:20Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:20 crc kubenswrapper[4824]: I0224 00:08:20.583457 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nwxht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea7ca84e-cb8c-4bdf-ba10-bd2aca09bdaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69860166bff170380f7baa9e9b405900881403e17fbb017db27e9136e1c3696e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lxc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nwxht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:20Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:20 crc kubenswrapper[4824]: I0224 00:08:20.600611 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939ca085-9383-42e6-b7d6-37f101137273\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc64aac8c2354ba053a22c4c5cde7
371a99c7aaaece1fe2dd2f7b41eceb4e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13c56d6a66d6912c9ae019eb515bfeb043d9ce26a200da5330581d07cd849ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-02-24T00:07:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vcbgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:20Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:20 crc kubenswrapper[4824]: I0224 00:08:20.619794 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d64vq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b16a40794ba6c7bfbe672f5438284c9479efc8d1fe6a87b8292b3595799772e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:0
7:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd5818c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cd5818c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d152897b22f88ea3328011ed0b921e0f33f5c0375f4af98b424da48c20a268\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43d152897b22f88ea3328011ed0b921e0f33f5c0375f4af98b424da48c20a268\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86a454258a9043824fa0ce318ad33be2fc4c029f75f0fc489b22716d69d67d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86a454258a9043824fa0ce318ad33be2fc4c029f75f0fc489b22716d69d67d45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:31Z\\\"}},\\\"volume
Mounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d64vq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:20Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:20 crc kubenswrapper[4824]: I0224 00:08:20.635307 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2zsq6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bffd69c2-56a8-4fa0-9fbf-82a508f80ec1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4215a2974f4f4d1ab7b093c8d457f1f450276c89565a95ea13f1d42db9f07afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkj25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2zsq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:20Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:20 crc kubenswrapper[4824]: I0224 00:08:20.650454 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0525cd89-44e0-47f1-856c-f566eb21596a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391a3f09eb431d4cc4af8eba60855591dd9fe90774ac0ecbf8ed112d41146fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5smpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f1d96e1625c71100e773b21982fde152e58d48d96841da879d6ab878342b1e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5smpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dbxhh\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:20Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:20 crc kubenswrapper[4824]: I0224 00:08:20.667709 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc53949c-a6a4-48c1-a312-cc8d39a3238f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://706813f751a53e8e88d84668245bb4b94bc39d6b611f0fed9f774c226dc8c632\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fe1cb1e114c1440630a33e3dd127af3600c520e37148cc01ab2e4ed346941ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T00:06
:33Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0224 00:06:03.976490 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0224 00:06:03.977317 1 observer_polling.go:159] Starting file observer\\\\nI0224 00:06:03.977962 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0224 00:06:03.978570 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0224 00:06:33.531382 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0224 00:06:33.531507 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:33Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:06:03Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45746da193a3aaa64e64067e2324afb4e4fb1a90a1357461ed97c0bbb5109f5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bf86745569172505df6632421bd1587317cb06e26e70937563bd7d0341c6086\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://213ad3b341ace3f4473aa48ecaaa41814fe670417431f6d5cd04be03482e597c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:20Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:20 crc kubenswrapper[4824]: I0224 00:08:20.691365 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc76b60d9c2e639d3340a172e70670a434008774f3ec5751dae2ffc78a913486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:20Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:20 crc kubenswrapper[4824]: I0224 00:08:20.693729 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:08:20 crc kubenswrapper[4824]: I0224 00:08:20.693844 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z42" Feb 24 00:08:20 crc kubenswrapper[4824]: I0224 00:08:20.693925 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:08:20 crc kubenswrapper[4824]: E0224 00:08:20.693862 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 00:08:20 crc kubenswrapper[4824]: E0224 00:08:20.694127 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z42" podUID="a648113f-3e46-4170-ba30-7155fefbb413" Feb 24 00:08:20 crc kubenswrapper[4824]: E0224 00:08:20.694201 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 00:08:20 crc kubenswrapper[4824]: I0224 00:08:20.707383 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17795ce3-5917-4f02-a513-de53b7c702cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd198d5e4adeaa8e23f5262bc17476b82b1c76b3bfd06b385590eede8c0baa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"v
ar-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://370e38e80ea6eb0a1add2e84d515cef4e8aeb77dc43beba4d0c3fdeb871f4fb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://370e38e80ea6eb0a1add2e84d515cef4e8aeb77dc43beba4d0c3fdeb871f4fb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:20Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:20 crc kubenswrapper[4824]: I0224 00:08:20.725729 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:20Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:20 crc kubenswrapper[4824]: I0224 00:08:20.740131 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:20Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:20 crc kubenswrapper[4824]: I0224 00:08:20.755293 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-98z42" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a648113f-3e46-4170-ba30-7155fefbb413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svh6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svh6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-98z42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:20Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:20 crc 
kubenswrapper[4824]: I0224 00:08:20.769097 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 09:08:58.740155317 +0000 UTC Feb 24 00:08:20 crc kubenswrapper[4824]: I0224 00:08:20.779714 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23aab44a-c0da-4344-b8b1-7c754b00d6d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65bbd80285850cb217ec0994ab8841efb660fd2488077ee4968b0c1f5d156fbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\
":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7244717b123b325ac83f68b976c1e5761a76b1aacac6aab471d0b644386251d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5ce8e051c69710768e7b4fba06cc91026ce4d25baf35cd8f9235bfd348a4451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://837b5de339793bc363199eae6f141038bf069b2832f6375a8e01d49bef7ea63d\\\",\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ede0889b30698004a8f288b9cc1bb7d00b194c21ff1936b2743f1e7f246e6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9c6e044265812e8e2116a81efcc38023cc8da2b92093e3f75c6efa4e359c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac9c6e044265812e8e2116a81efcc38023cc8da2b92093e3f75c6efa4e359c4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d404a30b7d0515ae872dab9ec3e60629de532d6f27ab0c9f1e7e3918bad30e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d404a30b7d0515ae872dab9ec3e60629de532d6f27ab0c9f1e7e3918bad30e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://513b48315443b87dc828225545ae02e1f47580596fc31a7935dd59b80065d26c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://513b
48315443b87dc828225545ae02e1f47580596fc31a7935dd59b80065d26c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:20Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:20 crc kubenswrapper[4824]: I0224 00:08:20.797451 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5209eb40-b077-42f2-9239-27bf1cde0e05\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c8ceeca4d71401bcd6ae7d050ac58f840a0cc91f699f91f799930c896be922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd9b54ae9c9e895db2191be45a25d352d8c0617350519ad21f6ed3f2af941fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf3a76ba41dba79001ebad9083600800b0f094accb5dc60c2a364aa001909feb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db87a346b00854fb6769899a263eb5f7fad8cfa115365e2197c0e02f7285ff99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T00:06:36Z\\\"
,\\\"message\\\":\\\"W0224 00:06:35.979734 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0224 00:06:35.980117 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771891595 cert, and key in /tmp/serving-cert-1325703996/serving-signer.crt, /tmp/serving-cert-1325703996/serving-signer.key\\\\nI0224 00:06:36.313594 1 observer_polling.go:159] Starting file observer\\\\nW0224 00:06:36.323413 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0224 00:06:36.325040 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 00:06:36.326853 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1325703996/tls.crt::/tmp/serving-cert-1325703996/tls.key\\\\\\\"\\\\nF0224 00:06:36.787238 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:06:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81bb21556a01a403315d2c9623aa07ab39ac4a789433423a89eda1da3547b6d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:20Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:20 crc kubenswrapper[4824]: I0224 00:08:20.815456 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684e1bbbe61030fdd75db917dbf9ac4fedf392502a653c8f4600dc829410aa95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://983e13c7b9d6d0b8183250aedfe93ec3ac42de4dae305c03caa9f653082bb329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:20Z is after 2025-08-24T17:21:41Z" 
Feb 24 00:08:20 crc kubenswrapper[4824]: I0224 00:08:20.831658 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:20Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:20 crc kubenswrapper[4824]: I0224 00:08:20.853509 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvqfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a39d567bc7ae642dd15ce2ea8650e1aa0fd3688e3c3c061ad145edf2bf963c06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cafa37efd3e722940c0c2daae6107dc324136c75f889084d39711c20d7e887d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T00:08:11Z\\\",\\\"message\\\":\\\"2026-02-24T00:07:26+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_61fbc54a-734f-43d4-9941-0e38fbccf016\\\\n2026-02-24T00:07:26+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_61fbc54a-734f-43d4-9941-0e38fbccf016 to /host/opt/cni/bin/\\\\n2026-02-24T00:07:26Z [verbose] multus-daemon started\\\\n2026-02-24T00:07:26Z [verbose] 
Readiness Indicator file check\\\\n2026-02-24T00:08:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvqfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:20Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:20 crc kubenswrapper[4824]: I0224 00:08:20.874880 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d985b875-dd5e-4767-a4e2-209894575a8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5817c201e3633a7aa1c9aa18cf5c3bf28f2a23c56257be31e935e9e29bd522ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76341c8e4fd3e6aaeb74b20155da61acd373c362ebccb81a83847b8aa813c8ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T00:07:53Z\\\",\\\"message\\\":\\\"61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where 
column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0224 00:07:53.258981 6959 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-98z42] creating logical port openshift-multus_network-metrics-daemon-98z42 for pod on switch crc\\\\nI0224 00:07:53.258997 6959 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh in node crc\\\\nI0224 00:07:53.259010 6959 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh after 0 failed attempt(s)\\\\nI0224 00:07:53.259021 6959 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh\\\\nI0224 00:07:53.258757 6959 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-vcbgn in node crc\\\\nI0224 00:07:53.259038 6959 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-vcbgn after 0 failed attempt(s)\\\\nI0224 00:07:53.259043 6959 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/machine-config-daemon-vcbgn\\\\nI0224 00:07:53.259043 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5817c201e3633a7aa1c9aa18cf5c3bf28f2a23c56257be31e935e9e29bd522ac\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T00:08:19Z\\\",\\\"message\\\":\\\"emplate:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.38\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0224 00:08:19.714118 7289 obj_retry.go:434] periodicallyRetryResources: Retry channel got triggered: retrying failed objects of type *v1.Pod\\\\nF0224 00:08:19.714202 7289 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped 
alrea\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\
\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda0
0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xjg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:20Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:21 crc kubenswrapper[4824]: I0224 00:08:21.535295 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xjg6_d985b875-dd5e-4767-a4e2-209894575a8f/ovnkube-controller/3.log" Feb 24 00:08:21 crc kubenswrapper[4824]: I0224 00:08:21.538701 4824 scope.go:117] "RemoveContainer" containerID="5817c201e3633a7aa1c9aa18cf5c3bf28f2a23c56257be31e935e9e29bd522ac" Feb 24 00:08:21 crc kubenswrapper[4824]: E0224 00:08:21.538895 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4xjg6_openshift-ovn-kubernetes(d985b875-dd5e-4767-a4e2-209894575a8f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" podUID="d985b875-dd5e-4767-a4e2-209894575a8f" Feb 24 00:08:21 crc kubenswrapper[4824]: I0224 00:08:21.557110 4824 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-d64vq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b16a40794ba6c7bfbe672f5438284c9479efc8d1fe6a87b8292b3595799772e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"contai
nerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:25Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd5818c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cd5818c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d152897b22f88ea3328011ed0b921e0f33f5c0375f4af98b424da48c20a268\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43d152897b22f88ea3328011ed0b921e0f33f5c0375f4af98b424da48c20a268\\\",\
\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86a454258a9043824fa0ce318ad33be2fc4c029f75f0fc489b22716d69d67d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86a454258a9043824fa0ce318ad33be2fc4c029f75f0fc489b22716d69d67d45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-d64vq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:21 crc kubenswrapper[4824]: I0224 00:08:21.568602 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2zsq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bffd69c2-56a8-4fa0-9fbf-82a508f80ec1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4215a2974f4f4d1ab7b093c8d457f1f450276c89565a95ea13f1d42db9f07afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkj25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2zsq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:21 crc kubenswrapper[4824]: I0224 00:08:21.579924 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0525cd89-44e0-47f1-856c-f566eb21596a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391a3f09eb431d4cc4af8eba60855591dd9fe90774ac0ecbf8ed112d41146fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5smpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f1d96e1625c71100e773b21982fde152e58d
48d96841da879d6ab878342b1e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5smpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dbxhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:21 crc kubenswrapper[4824]: I0224 00:08:21.594561 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56aba328-bba3-447a-9fc1-9ed04311acf5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93bd599f0dec3d074b172aab55b3b74283b37d4dddc2e6a97b7319e8d7bd15b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://357f08b7a73fd26e3a3c8232ff5641ec70da20551310ce9348d3b1462a30735c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52d213e230e92cadfb65d0efd8b4dea058e9f947897ad48c9c9d123ed37263ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1580132fb7d7b9203017c471a27257d03282c1b3dd97167f647ecd3848ba69a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://1580132fb7d7b9203017c471a27257d03282c1b3dd97167f647ecd3848ba69a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:21 crc kubenswrapper[4824]: I0224 00:08:21.609379 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b147bde27310be40a9d49e2254ff858c03a32602bad1de273272b7b3292f057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-24T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:21 crc kubenswrapper[4824]: I0224 00:08:21.621241 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nwxht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea7ca84e-cb8c-4bdf-ba10-bd2aca09bdaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69860166bff170380f7baa9e9b405900881403e17fbb017db27e9136e1c3696e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lxc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nwxht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:21 crc kubenswrapper[4824]: I0224 00:08:21.633993 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939ca085-9383-42e6-b7d6-37f101137273\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc64aac8c2354ba053a22c4c5cde7
371a99c7aaaece1fe2dd2f7b41eceb4e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13c56d6a66d6912c9ae019eb515bfeb043d9ce26a200da5330581d07cd849ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-02-24T00:07:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vcbgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:21 crc kubenswrapper[4824]: I0224 00:08:21.647472 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc53949c-a6a4-48c1-a312-cc8d39a3238f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://706813f751a53e8e88d84668245bb4b94bc39d6b611f0fed9f774c226dc8c632\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fe1cb1e114c144
0630a33e3dd127af3600c520e37148cc01ab2e4ed346941ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T00:06:33Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0224 00:06:03.976490 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0224 00:06:03.977317 1 observer_polling.go:159] Starting file observer\\\\nI0224 00:06:03.977962 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0224 00:06:03.978570 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0224 00:06:33.531382 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0224 00:06:33.531507 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:06:33Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:06:03Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45746da193a3aaa64e64067e2324afb4e4fb1a90a1357461ed97c0bbb5109f5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bf86745569172505df6632421bd1587317cb06e26e70937563bd7d0341c6086\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://213ad3b341ace3f4473aa48ecaaa41814fe670417431f6d5cd04be03482e597c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:21 crc kubenswrapper[4824]: I0224 00:08:21.660761 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc76b60d9c2e639d3340a172e70670a434008774f3ec5751dae2ffc78a913486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:21 crc kubenswrapper[4824]: I0224 00:08:21.672279 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17795ce3-5917-4f02-a513-de53b7c702cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd198d5e4adeaa8e23f5262bc17476b82b1c76b3bfd06b385590eede8c0baa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://370e38e80ea6eb0a1add2e84d515cef4e8aeb77dc43beba4d0c3fdeb871f4fb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://370e38e80ea6eb0a1add2e84d515cef4e8aeb77dc43beba4d0c3fdeb871f4fb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:21 crc kubenswrapper[4824]: I0224 00:08:21.687933 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:21Z is after 2025-08-24T17:21:41Z"
Feb 24 00:08:21 crc kubenswrapper[4824]: I0224 00:08:21.692758 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 24 00:08:21 crc kubenswrapper[4824]: E0224 00:08:21.692930 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 00:08:21 crc kubenswrapper[4824]: I0224 00:08:21.700675 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:21 crc kubenswrapper[4824]: I0224 00:08:21.713796 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-98z42" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a648113f-3e46-4170-ba30-7155fefbb413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svh6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svh6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-98z42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:21 crc 
kubenswrapper[4824]: I0224 00:08:21.732075 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvqfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a39d567bc7ae642dd15ce2ea8650e1aa0fd3688e3c3c061ad145edf2bf963c06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cafa37efd3e722940c0c2daae6107dc324136c75f889084d39711c20d7e887d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T00:08:11Z\\\",\\\"message\\\":\\\"2026-02-24T00:07:26+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_61fbc54a-734f-43d4-9941-0e38fbccf016\\\\n2026-02-24T00:07:26+00:00 [cnibincopy] Successfully moved files in 
/host/opt/cni/bin/upgrade_61fbc54a-734f-43d4-9941-0e38fbccf016 to /host/opt/cni/bin/\\\\n2026-02-24T00:07:26Z [verbose] multus-daemon started\\\\n2026-02-24T00:07:26Z [verbose] Readiness Indicator file check\\\\n2026-02-24T00:08:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\
\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvqfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:21 crc kubenswrapper[4824]: I0224 00:08:21.751362 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d985b875-dd5e-4767-a4e2-209894575a8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5817c201e3633a7aa1c9aa18cf5c3bf28f2a23c56257be31e935e9e29bd522ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5817c201e3633a7aa1c9aa18cf5c3bf28f2a23c56257be31e935e9e29bd522ac\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T00:08:19Z\\\",\\\"message\\\":\\\"emplate:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, 
Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.38\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0224 00:08:19.714118 7289 obj_retry.go:434] periodicallyRetryResources: Retry channel got triggered: retrying failed objects of type *v1.Pod\\\\nF0224 00:08:19.714202 7289 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped alrea\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:08:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4xjg6_openshift-ovn-kubernetes(d985b875-dd5e-4767-a4e2-209894575a8f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be
3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xjg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:21 crc kubenswrapper[4824]: I0224 00:08:21.769685 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 04:07:30.45471987 +0000 UTC Feb 24 00:08:21 crc kubenswrapper[4824]: I0224 00:08:21.770180 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23aab44a-c0da-4344-b8b1-7c754b00d6d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65bbd80285850cb217ec0994ab8841efb660fd2488077ee4968b0c1f5d156fbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7244717b123b325ac83f68b976c1e5761a76b1aacac6aab471d0b644386251d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5ce8e051c69710768e7b4fba06cc91026ce4d25baf35cd8f9235bfd348a4451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://837b5de339793bc363199eae6f141038bf069b2832f6375a8e01d49bef7ea63d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ede0889b30698004a8f288b9cc1bb7d00b194c21ff1936b2743f1e7f246e6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9c6e044265812e8e2116a81efcc38023cc8da2b92093e3f75c6efa4e359c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac9c6e044265812e8e2116a81efcc38023cc8da2b92093e3f75c6efa4e359c4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d404a30b7d0515ae872dab9ec3e60629de532d6f27ab0c9f1e7e3918bad30e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d404a30b7d0515ae872dab9ec3e60629de532d6f27ab0c9f1e7e3918bad30e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://513b48315443b87dc828225545ae02e1f47580596fc31a7935dd59b80065d26c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://513b48315443b87dc828225545ae02e1f47580596fc31a7935dd59b80065d26c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:21 crc kubenswrapper[4824]: I0224 00:08:21.784750 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5209eb40-b077-42f2-9239-27bf1cde0e05\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c8ceeca4d71401bcd6ae7d050ac58f840a0cc91f699f91f799930c896be922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd9b54ae9c9e895db2191be45a25d352d8c0617350519ad21f6ed3f2af941fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf3a76ba41dba79001ebad9083600800b0f094accb5dc60c2a364aa001909feb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db87a346b00854fb6769899a263eb5f7fad8cfa115365e2197c0e02f7285ff99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T00:06:36Z\\\",\\\"message\\\":\\\"W0224 00:06:35.979734 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0224 00:06:35.980117 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771891595 cert, and key in /tmp/serving-cert-1325703996/serving-signer.crt, /tmp/serving-cert-1325703996/serving-signer.key\\\\nI0224 00:06:36.313594 1 observer_polling.go:159] Starting file observer\\\\nW0224 00:06:36.323413 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0224 00:06:36.325040 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 00:06:36.326853 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1325703996/tls.crt::/tmp/serving-cert-1325703996/tls.key\\\\\\\"\\\\nF0224 00:06:36.787238 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:06:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81bb21556a01a403315d2c9623aa07ab39ac4a789433423a89eda1da3547b6d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:21 crc kubenswrapper[4824]: E0224 00:08:21.785408 4824 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 24 00:08:21 crc kubenswrapper[4824]: I0224 00:08:21.803039 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684e1bbbe61030fdd75db917dbf9ac4fedf392502a653c8f4600dc829410aa95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://983e13c7b9d6d0b8183250aedfe93ec3ac42de4dae305c03caa9f653082bb329\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:21Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:21 crc kubenswrapper[4824]: I0224 00:08:21.821150 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:21Z is after 2025-08-24T17:21:41Z"
Feb 24 00:08:22 crc kubenswrapper[4824]: I0224 00:08:22.693104 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 24 00:08:22 crc kubenswrapper[4824]: I0224 00:08:22.693187 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z42"
Feb 24 00:08:22 crc kubenswrapper[4824]: I0224 00:08:22.693187 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 24 00:08:22 crc kubenswrapper[4824]: E0224 00:08:22.693326 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 24 00:08:22 crc kubenswrapper[4824]: E0224 00:08:22.693479 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z42" podUID="a648113f-3e46-4170-ba30-7155fefbb413"
Feb 24 00:08:22 crc kubenswrapper[4824]: E0224 00:08:22.693632 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 24 00:08:22 crc kubenswrapper[4824]: I0224 00:08:22.770367 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 23:49:59.805144189 +0000 UTC
Feb 24 00:08:22 crc kubenswrapper[4824]: I0224 00:08:22.819829 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:08:22 crc kubenswrapper[4824]: I0224 00:08:22.819861 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:08:22 crc kubenswrapper[4824]: I0224 00:08:22.819872 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:08:22 crc kubenswrapper[4824]: I0224 00:08:22.819886 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 00:08:22 crc 
kubenswrapper[4824]: I0224 00:08:22.819894 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:08:22Z","lastTransitionTime":"2026-02-24T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:08:22 crc kubenswrapper[4824]: E0224 00:08:22.832919 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7ea41d01-04ab-44da-af10-993e94777268\\\",\\\"systemUUID\\\":\\\"d5e3d68d-d538-4dbe-b3fe-7347ab36b29a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:22Z is after 2025-08-24T17:21:41Z"
Feb 24 00:08:22 crc kubenswrapper[4824]: I0224 00:08:22.837225 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:08:22 crc kubenswrapper[4824]: I0224 00:08:22.837258 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:08:22 crc kubenswrapper[4824]: I0224 00:08:22.837266 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:08:22 crc kubenswrapper[4824]: I0224 00:08:22.837282 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 00:08:22 crc kubenswrapper[4824]: I0224 00:08:22.837292 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:08:22Z","lastTransitionTime":"2026-02-24T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:08:22 crc kubenswrapper[4824]: E0224 00:08:22.854903 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7ea41d01-04ab-44da-af10-993e94777268\\\",\\\"systemUUID\\\":\\\"d5e3d68d-d538-4dbe-b3fe-7347ab36b29a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:22Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:22 crc kubenswrapper[4824]: I0224 00:08:22.858193 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:08:22 crc kubenswrapper[4824]: I0224 00:08:22.858237 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:08:22 crc kubenswrapper[4824]: I0224 00:08:22.858251 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:08:22 crc kubenswrapper[4824]: I0224 00:08:22.858274 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:08:22 crc kubenswrapper[4824]: I0224 00:08:22.858289 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:08:22Z","lastTransitionTime":"2026-02-24T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:08:22 crc kubenswrapper[4824]: E0224 00:08:22.871865 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7ea41d01-04ab-44da-af10-993e94777268\\\",\\\"systemUUID\\\":\\\"d5e3d68d-d538-4dbe-b3fe-7347ab36b29a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:22Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:22 crc kubenswrapper[4824]: I0224 00:08:22.876137 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:08:22 crc kubenswrapper[4824]: I0224 00:08:22.876181 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:08:22 crc kubenswrapper[4824]: I0224 00:08:22.876194 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:08:22 crc kubenswrapper[4824]: I0224 00:08:22.876213 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:08:22 crc kubenswrapper[4824]: I0224 00:08:22.876225 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:08:22Z","lastTransitionTime":"2026-02-24T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:08:22 crc kubenswrapper[4824]: E0224 00:08:22.893661 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7ea41d01-04ab-44da-af10-993e94777268\\\",\\\"systemUUID\\\":\\\"d5e3d68d-d538-4dbe-b3fe-7347ab36b29a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:22Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:22 crc kubenswrapper[4824]: I0224 00:08:22.897313 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:08:22 crc kubenswrapper[4824]: I0224 00:08:22.897354 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:08:22 crc kubenswrapper[4824]: I0224 00:08:22.897365 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:08:22 crc kubenswrapper[4824]: I0224 00:08:22.897383 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:08:22 crc kubenswrapper[4824]: I0224 00:08:22.897393 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:08:22Z","lastTransitionTime":"2026-02-24T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:08:22 crc kubenswrapper[4824]: E0224 00:08:22.911067 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7ea41d01-04ab-44da-af10-993e94777268\\\",\\\"systemUUID\\\":\\\"d5e3d68d-d538-4dbe-b3fe-7347ab36b29a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:22Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:22 crc kubenswrapper[4824]: E0224 00:08:22.911181 4824 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 24 00:08:23 crc kubenswrapper[4824]: I0224 00:08:23.693389 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:08:23 crc kubenswrapper[4824]: E0224 00:08:23.693629 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 00:08:23 crc kubenswrapper[4824]: I0224 00:08:23.771635 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 21:01:22.567733481 +0000 UTC Feb 24 00:08:24 crc kubenswrapper[4824]: I0224 00:08:24.693625 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:08:24 crc kubenswrapper[4824]: I0224 00:08:24.693638 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z42" Feb 24 00:08:24 crc kubenswrapper[4824]: E0224 00:08:24.694225 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 00:08:24 crc kubenswrapper[4824]: I0224 00:08:24.693696 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:08:24 crc kubenswrapper[4824]: E0224 00:08:24.694351 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z42" podUID="a648113f-3e46-4170-ba30-7155fefbb413" Feb 24 00:08:24 crc kubenswrapper[4824]: E0224 00:08:24.694424 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 00:08:24 crc kubenswrapper[4824]: I0224 00:08:24.772580 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 11:38:23.97220231 +0000 UTC Feb 24 00:08:25 crc kubenswrapper[4824]: I0224 00:08:25.693388 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:08:25 crc kubenswrapper[4824]: E0224 00:08:25.693587 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 00:08:25 crc kubenswrapper[4824]: I0224 00:08:25.773137 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 02:34:53.285288771 +0000 UTC Feb 24 00:08:26 crc kubenswrapper[4824]: I0224 00:08:26.693839 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:08:26 crc kubenswrapper[4824]: I0224 00:08:26.693932 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z42" Feb 24 00:08:26 crc kubenswrapper[4824]: E0224 00:08:26.694067 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 00:08:26 crc kubenswrapper[4824]: E0224 00:08:26.694192 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z42" podUID="a648113f-3e46-4170-ba30-7155fefbb413" Feb 24 00:08:26 crc kubenswrapper[4824]: I0224 00:08:26.693857 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:08:26 crc kubenswrapper[4824]: E0224 00:08:26.694865 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 00:08:26 crc kubenswrapper[4824]: I0224 00:08:26.719058 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23aab44a-c0da-4344-b8b1-7c754b00d6d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65bbd80285850cb217ec0994ab8841efb660fd2488077ee4968b0c1f5d156fbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7244717b123b325ac83f68b976c1e5761a76b1aacac6aab471d0b644386251d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5ce8e051c69710768e7b4fba06cc91026ce4d25baf35cd8f9235bfd348a4451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://837b5de339793bc363199eae6f141038bf069b2832f6375a8e01d49bef7ea63d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8
ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50ede0889b30698004a8f288b9cc1bb7d00b194c21ff1936b2743f1e7f246e6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9c6e044265812e8e2116a81efcc38023cc8da2b92093e3f75c6efa4e359c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314
731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac9c6e044265812e8e2116a81efcc38023cc8da2b92093e3f75c6efa4e359c4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d404a30b7d0515ae872dab9ec3e60629de532d6f27ab0c9f1e7e3918bad30e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d404a30b7d0515ae872dab9ec3e60629de532d6f27ab0c9f1e7e3918bad30e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://513b48315443b87dc828225545ae02e1f47580596fc31a7935dd59b80065d26c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://513b48315443b87dc828225545ae02e1f47580596fc31a7935dd59b80065d26c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-2
4T00:05:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:26Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:26 crc kubenswrapper[4824]: I0224 00:08:26.737875 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5209eb40-b077-42f2-9239-27bf1cde0e05\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c8ceeca4d71401bcd6ae7d050ac58f840a0cc91f699f91f799930c896be922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd9b54ae9c9e895db2191be45a25d352d8c0617350519ad21f6ed3f2af941fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf3a76ba41dba79001ebad9083600800b0f094accb5dc60c2a364aa001909feb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db87a346b00854fb6769899a263eb5f7fad8cfa115365e2197c0e02f7285ff99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T00:06:36Z\\\"
,\\\"message\\\":\\\"W0224 00:06:35.979734 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0224 00:06:35.980117 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771891595 cert, and key in /tmp/serving-cert-1325703996/serving-signer.crt, /tmp/serving-cert-1325703996/serving-signer.key\\\\nI0224 00:06:36.313594 1 observer_polling.go:159] Starting file observer\\\\nW0224 00:06:36.323413 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0224 00:06:36.325040 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 00:06:36.326853 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1325703996/tls.crt::/tmp/serving-cert-1325703996/tls.key\\\\\\\"\\\\nF0224 00:06:36.787238 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:06:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81bb21556a01a403315d2c9623aa07ab39ac4a789433423a89eda1da3547b6d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:26Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:26 crc kubenswrapper[4824]: I0224 00:08:26.754711 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://684e1bbbe61030fdd75db917dbf9ac4fedf392502a653c8f4600dc829410aa95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://983e13c7b9d6d0b8183250aedfe93ec3ac42de4dae305c03caa9f653082bb329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:26Z is after 2025-08-24T17:21:41Z" 
Feb 24 00:08:26 crc kubenswrapper[4824]: I0224 00:08:26.775665 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 14:47:10.341864609 +0000 UTC Feb 24 00:08:26 crc kubenswrapper[4824]: E0224 00:08:26.786059 4824 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 24 00:08:26 crc kubenswrapper[4824]: I0224 00:08:26.797853 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:26Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:26 crc kubenswrapper[4824]: I0224 00:08:26.821795 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wvqfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a39d567bc7ae642dd15ce2ea8650e1aa0fd3688e3c3c061ad145edf2bf963c06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cafa37efd3e722940c0c2daae6107dc324136c75f889084d39711c20d7e887d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T00:08:11Z\\\",\\\"message\\\":\\\"2026-02-24T00:07:26+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_61fbc54a-734f-43d4-9941-0e38fbccf016\\\\n2026-02-24T00:07:26+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_61fbc54a-734f-43d4-9941-0e38fbccf016 to /host/opt/cni/bin/\\\\n2026-02-24T00:07:26Z [verbose] multus-daemon started\\\\n2026-02-24T00:07:26Z [verbose] 
Readiness Indicator file check\\\\n2026-02-24T00:08:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wvqfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:26Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:26 crc kubenswrapper[4824]: I0224 00:08:26.847334 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d985b875-dd5e-4767-a4e2-209894575a8f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5817c201e3633a7aa1c9aa18cf5c3bf28f2a23c56257be31e935e9e29bd522ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5817c201e3633a7aa1c9aa18cf5c3bf28f2a23c56257be31e935e9e29bd522ac\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T00:08:19Z\\\",\\\"message\\\":\\\"emplate:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, 
Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.38\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0224 00:08:19.714118 7289 obj_retry.go:434] periodicallyRetryResources: Retry channel got triggered: retrying failed objects of type *v1.Pod\\\\nF0224 00:08:19.714202 7289 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped alrea\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:08:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4xjg6_openshift-ovn-kubernetes(d985b875-dd5e-4767-a4e2-209894575a8f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cd4549610e3a7e2be
3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6rnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xjg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:26Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:26 crc kubenswrapper[4824]: I0224 00:08:26.867988 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0525cd89-44e0-47f1-856c-f566eb21596a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391a3f09eb431d4cc4af8eba60855591dd9fe90774ac0ecbf8ed112d41146fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5smpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f1d96e1625c71100e773b21982fde152e58d
48d96841da879d6ab878342b1e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5smpb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dbxhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:26Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:26 crc kubenswrapper[4824]: I0224 00:08:26.880949 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56aba328-bba3-447a-9fc1-9ed04311acf5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93bd599f0dec3d074b172aab55b3b74283b37d4dddc2e6a97b7319e8d7bd15b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://357f08b7a73fd26e3a3c8232ff5641ec70da20551310ce9348d3b1462a30735c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52d213e230e92cadfb65d0efd8b4dea058e9f947897ad48c9c9d123ed37263ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1580132fb7d7b9203017c471a27257d03282c1b3dd97167f647ecd3848ba69a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://1580132fb7d7b9203017c471a27257d03282c1b3dd97167f647ecd3848ba69a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:26Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:26 crc kubenswrapper[4824]: I0224 00:08:26.893198 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b147bde27310be40a9d49e2254ff858c03a32602bad1de273272b7b3292f057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-24T00:08:26Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:26 crc kubenswrapper[4824]: I0224 00:08:26.904682 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nwxht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea7ca84e-cb8c-4bdf-ba10-bd2aca09bdaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69860166bff170380f7baa9e9b405900881403e17fbb017db27e9136e1c3696e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-lxc9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nwxht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:26Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:26 crc kubenswrapper[4824]: I0224 00:08:26.917773 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"939ca085-9383-42e6-b7d6-37f101137273\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2cc64aac8c2354ba053a22c4c5cde7
371a99c7aaaece1fe2dd2f7b41eceb4e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13c56d6a66d6912c9ae019eb515bfeb043d9ce26a200da5330581d07cd849ba3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6f4s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-02-24T00:07:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vcbgn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:26Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:26 crc kubenswrapper[4824]: I0224 00:08:26.935370 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-d64vq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28309e58-76b2-4fe6-a1e5-569b6f0b3a5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b16a40794ba6c7bfbe672f5438284c9479efc8d1fe6a87b8292b3595799772e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://002f92646d054b312160a3b59900255e35829570bf3dbef9c871b6772f3628f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7a337a9d263f11a833a233e7ad735468a159be0fe3a62e21d72ae816355a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69c7a6d07cb824c7aa646720b7d0a6563a3fe4de8af9bfcf85b8a4093df9154\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:0
7:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cd5818c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cd5818c073a3dabff4b4b5d2cc02a40b8a722c642730ea29e07f840db9afc50\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43d152897b22f88ea3328011ed0b921e0f33f5c0375f4af98b424da48c20a268\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://43d152897b22f88ea3328011ed0b921e0f33f5c0375f4af98b424da48c20a268\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86a454258a9043824fa0ce318ad33be2fc4c029f75f0fc489b22716d69d67d45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86a454258a9043824fa0ce318ad33be2fc4c029f75f0fc489b22716d69d67d45\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:07:31Z\\\"}},\\\"volume
Mounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ttlvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-d64vq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:26Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:26 crc kubenswrapper[4824]: I0224 00:08:26.947069 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2zsq6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bffd69c2-56a8-4fa0-9fbf-82a508f80ec1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4215a2974f4f4d1ab7b093c8d457f1f450276c89565a95ea13f1d42db9f07afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkj25\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2zsq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:26Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:26 crc kubenswrapper[4824]: I0224 00:08:26.960737 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc53949c-a6a4-48c1-a312-cc8d39a3238f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://706813f751a53e8e88d84668245bb4b94bc39d6b611f0fed9f774c226dc8c632\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fe1cb1e114c1440630a33e3dd127af3600c520e37148cc01ab2e4ed346941ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T00:06:33Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0224 00:06:03.976490 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0224 00:06:03.977317 1 observer_polling.go:159] Starting file observer\\\\nI0224 00:06:03.977962 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0224 00:06:03.978570 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0224 00:06:33.531382 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0224 00:06:33.531507 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-24T00:06:33Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T00:06:03Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45746da193a3aaa64e64067e2324afb4e4fb1a90a1357461ed97c0bbb5109f5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bf86745569172505df6632421bd1587317cb06e26e70937563bd7d0341c6086\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://213ad3b341ace3f4473aa48ecaaa41814fe670417431f6d5cd04be03482e597c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:26Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:26 crc kubenswrapper[4824]: I0224 00:08:26.975443 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc76b60d9c2e639d3340a172e70670a434008774f3ec5751dae2ffc78a913486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:26Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:26 crc kubenswrapper[4824]: I0224 00:08:26.986552 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17795ce3-5917-4f02-a513-de53b7c702cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:05:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdd198d5e4adeaa8e23f5262bc17476b82b1c76b3bfd06b385590eede8c0baa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T00:05:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://370e38e80ea6eb0a1add2e84d515cef4e8aeb77dc43beba4d0c3fdeb871f4fb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://370e38e80ea6eb0a1add2e84d515cef4e8aeb77dc43beba4d0c3fdeb871f4fb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T00:05:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T00:05:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:05:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:26Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:27 crc kubenswrapper[4824]: I0224 00:08:27.000868 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:26Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:27 crc kubenswrapper[4824]: I0224 00:08:27.014306 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:27Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:27 crc kubenswrapper[4824]: I0224 00:08:27.025843 4824 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-98z42" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a648113f-3e46-4170-ba30-7155fefbb413\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T00:07:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svh6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svh6v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T00:07:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-98z42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:27Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:27 crc 
kubenswrapper[4824]: I0224 00:08:27.693618 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:08:27 crc kubenswrapper[4824]: E0224 00:08:27.693769 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 00:08:27 crc kubenswrapper[4824]: I0224 00:08:27.776356 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 17:26:08.893577574 +0000 UTC Feb 24 00:08:28 crc kubenswrapper[4824]: I0224 00:08:28.693881 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z42" Feb 24 00:08:28 crc kubenswrapper[4824]: I0224 00:08:28.693979 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:08:28 crc kubenswrapper[4824]: I0224 00:08:28.693910 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:08:28 crc kubenswrapper[4824]: E0224 00:08:28.694174 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-98z42" podUID="a648113f-3e46-4170-ba30-7155fefbb413" Feb 24 00:08:28 crc kubenswrapper[4824]: E0224 00:08:28.694266 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 00:08:28 crc kubenswrapper[4824]: E0224 00:08:28.694408 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 00:08:28 crc kubenswrapper[4824]: I0224 00:08:28.776595 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 10:47:20.482676861 +0000 UTC Feb 24 00:08:29 crc kubenswrapper[4824]: I0224 00:08:29.693598 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:08:29 crc kubenswrapper[4824]: E0224 00:08:29.693768 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 00:08:29 crc kubenswrapper[4824]: I0224 00:08:29.777539 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 02:14:48.631276644 +0000 UTC Feb 24 00:08:30 crc kubenswrapper[4824]: I0224 00:08:30.693336 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:08:30 crc kubenswrapper[4824]: I0224 00:08:30.693431 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:08:30 crc kubenswrapper[4824]: I0224 00:08:30.693501 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z42" Feb 24 00:08:30 crc kubenswrapper[4824]: E0224 00:08:30.693903 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 00:08:30 crc kubenswrapper[4824]: E0224 00:08:30.694149 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 00:08:30 crc kubenswrapper[4824]: E0224 00:08:30.694386 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z42" podUID="a648113f-3e46-4170-ba30-7155fefbb413" Feb 24 00:08:30 crc kubenswrapper[4824]: I0224 00:08:30.778490 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 18:21:49.045162605 +0000 UTC Feb 24 00:08:31 crc kubenswrapper[4824]: I0224 00:08:31.693094 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:08:31 crc kubenswrapper[4824]: E0224 00:08:31.693346 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 00:08:31 crc kubenswrapper[4824]: I0224 00:08:31.779157 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 21:32:35.110024379 +0000 UTC Feb 24 00:08:31 crc kubenswrapper[4824]: E0224 00:08:31.787178 4824 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 24 00:08:32 crc kubenswrapper[4824]: I0224 00:08:32.694215 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z42" Feb 24 00:08:32 crc kubenswrapper[4824]: E0224 00:08:32.694381 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z42" podUID="a648113f-3e46-4170-ba30-7155fefbb413" Feb 24 00:08:32 crc kubenswrapper[4824]: I0224 00:08:32.694290 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:08:32 crc kubenswrapper[4824]: E0224 00:08:32.694456 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 00:08:32 crc kubenswrapper[4824]: I0224 00:08:32.694277 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:08:32 crc kubenswrapper[4824]: E0224 00:08:32.694512 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 00:08:32 crc kubenswrapper[4824]: I0224 00:08:32.780215 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 10:10:06.550306153 +0000 UTC Feb 24 00:08:33 crc kubenswrapper[4824]: I0224 00:08:33.148362 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:08:33 crc kubenswrapper[4824]: I0224 00:08:33.148404 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:08:33 crc kubenswrapper[4824]: I0224 00:08:33.148414 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:08:33 crc kubenswrapper[4824]: I0224 00:08:33.148435 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:08:33 crc kubenswrapper[4824]: I0224 00:08:33.148446 4824 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:08:33Z","lastTransitionTime":"2026-02-24T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 00:08:33 crc kubenswrapper[4824]: E0224 00:08:33.168213 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7ea41d01-04ab-44da-af10-993e94777268\\\",\\\"systemUUID\\\":\\\"d5e3d68d-d538-4dbe-b3fe-7347ab36b29a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:33Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:33 crc kubenswrapper[4824]: I0224 00:08:33.176784 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:08:33 crc kubenswrapper[4824]: I0224 00:08:33.176857 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:08:33 crc kubenswrapper[4824]: I0224 00:08:33.176990 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:08:33 crc kubenswrapper[4824]: I0224 00:08:33.177804 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:08:33 crc kubenswrapper[4824]: I0224 00:08:33.177823 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:08:33Z","lastTransitionTime":"2026-02-24T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:08:33 crc kubenswrapper[4824]: E0224 00:08:33.193949 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7ea41d01-04ab-44da-af10-993e94777268\\\",\\\"systemUUID\\\":\\\"d5e3d68d-d538-4dbe-b3fe-7347ab36b29a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:33Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:33 crc kubenswrapper[4824]: I0224 00:08:33.198551 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:08:33 crc kubenswrapper[4824]: I0224 00:08:33.198591 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:08:33 crc kubenswrapper[4824]: I0224 00:08:33.198603 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:08:33 crc kubenswrapper[4824]: I0224 00:08:33.198622 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:08:33 crc kubenswrapper[4824]: I0224 00:08:33.198634 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:08:33Z","lastTransitionTime":"2026-02-24T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:08:33 crc kubenswrapper[4824]: E0224 00:08:33.214000 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7ea41d01-04ab-44da-af10-993e94777268\\\",\\\"systemUUID\\\":\\\"d5e3d68d-d538-4dbe-b3fe-7347ab36b29a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:33Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:33 crc kubenswrapper[4824]: I0224 00:08:33.217533 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:08:33 crc kubenswrapper[4824]: I0224 00:08:33.217574 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:08:33 crc kubenswrapper[4824]: I0224 00:08:33.217583 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:08:33 crc kubenswrapper[4824]: I0224 00:08:33.217602 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:08:33 crc kubenswrapper[4824]: I0224 00:08:33.217613 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:08:33Z","lastTransitionTime":"2026-02-24T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:08:33 crc kubenswrapper[4824]: E0224 00:08:33.231767 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7ea41d01-04ab-44da-af10-993e94777268\\\",\\\"systemUUID\\\":\\\"d5e3d68d-d538-4dbe-b3fe-7347ab36b29a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:33Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:33 crc kubenswrapper[4824]: I0224 00:08:33.236663 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 00:08:33 crc kubenswrapper[4824]: I0224 00:08:33.236698 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 00:08:33 crc kubenswrapper[4824]: I0224 00:08:33.236708 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 00:08:33 crc kubenswrapper[4824]: I0224 00:08:33.236727 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 00:08:33 crc kubenswrapper[4824]: I0224 00:08:33.236743 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:08:33Z","lastTransitionTime":"2026-02-24T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 00:08:33 crc kubenswrapper[4824]: E0224 00:08:33.250899 4824 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T00:08:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7ea41d01-04ab-44da-af10-993e94777268\\\",\\\"systemUUID\\\":\\\"d5e3d68d-d538-4dbe-b3fe-7347ab36b29a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T00:08:33Z is after 2025-08-24T17:21:41Z" Feb 24 00:08:33 crc kubenswrapper[4824]: E0224 00:08:33.251049 4824 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 24 00:08:33 crc kubenswrapper[4824]: I0224 00:08:33.693165 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:08:33 crc kubenswrapper[4824]: E0224 00:08:33.693416 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 00:08:33 crc kubenswrapper[4824]: I0224 00:08:33.780883 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 10:57:54.168614833 +0000 UTC Feb 24 00:08:34 crc kubenswrapper[4824]: I0224 00:08:34.693146 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z42" Feb 24 00:08:34 crc kubenswrapper[4824]: I0224 00:08:34.693174 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:08:34 crc kubenswrapper[4824]: I0224 00:08:34.693705 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:08:34 crc kubenswrapper[4824]: E0224 00:08:34.693911 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z42" podUID="a648113f-3e46-4170-ba30-7155fefbb413" Feb 24 00:08:34 crc kubenswrapper[4824]: E0224 00:08:34.693991 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 00:08:34 crc kubenswrapper[4824]: E0224 00:08:34.694130 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 00:08:34 crc kubenswrapper[4824]: I0224 00:08:34.694452 4824 scope.go:117] "RemoveContainer" containerID="5817c201e3633a7aa1c9aa18cf5c3bf28f2a23c56257be31e935e9e29bd522ac" Feb 24 00:08:34 crc kubenswrapper[4824]: E0224 00:08:34.694941 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4xjg6_openshift-ovn-kubernetes(d985b875-dd5e-4767-a4e2-209894575a8f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" podUID="d985b875-dd5e-4767-a4e2-209894575a8f" Feb 24 00:08:34 crc kubenswrapper[4824]: I0224 00:08:34.781591 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 05:22:23.613055346 +0000 UTC Feb 24 00:08:35 crc kubenswrapper[4824]: I0224 00:08:35.693577 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:08:35 crc kubenswrapper[4824]: E0224 00:08:35.693918 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 00:08:35 crc kubenswrapper[4824]: I0224 00:08:35.781838 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 17:25:17.474643412 +0000 UTC Feb 24 00:08:36 crc kubenswrapper[4824]: I0224 00:08:36.693453 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:08:36 crc kubenswrapper[4824]: E0224 00:08:36.693774 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 00:08:36 crc kubenswrapper[4824]: I0224 00:08:36.693900 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z42" Feb 24 00:08:36 crc kubenswrapper[4824]: I0224 00:08:36.694077 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:08:36 crc kubenswrapper[4824]: E0224 00:08:36.694668 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 00:08:36 crc kubenswrapper[4824]: E0224 00:08:36.694926 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z42" podUID="a648113f-3e46-4170-ba30-7155fefbb413" Feb 24 00:08:36 crc kubenswrapper[4824]: I0224 00:08:36.730892 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=35.730860675 podStartE2EDuration="35.730860675s" podCreationTimestamp="2026-02-24 00:08:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:08:36.730566657 +0000 UTC m=+180.720191156" watchObservedRunningTime="2026-02-24 00:08:36.730860675 +0000 UTC m=+180.720485184" Feb 24 00:08:36 crc kubenswrapper[4824]: I0224 00:08:36.782824 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 02:03:02.927029489 +0000 UTC Feb 24 00:08:36 crc kubenswrapper[4824]: E0224 00:08:36.787695 4824 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 24 00:08:36 crc kubenswrapper[4824]: I0224 00:08:36.813032 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=26.812995875 podStartE2EDuration="26.812995875s" podCreationTimestamp="2026-02-24 00:08:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:08:36.810727802 +0000 UTC m=+180.800352291" watchObservedRunningTime="2026-02-24 00:08:36.812995875 +0000 UTC m=+180.802620364"
Feb 24 00:08:36 crc kubenswrapper[4824]: I0224 00:08:36.834766 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=89.8347431 podStartE2EDuration="1m29.8347431s" podCreationTimestamp="2026-02-24 00:07:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:08:36.833506556 +0000 UTC m=+180.823131025" watchObservedRunningTime="2026-02-24 00:08:36.8347431 +0000 UTC m=+180.824367569"
Feb 24 00:08:36 crc kubenswrapper[4824]: I0224 00:08:36.914320 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-wvqfl" podStartSLOduration=106.914297149 podStartE2EDuration="1m46.914297149s" podCreationTimestamp="2026-02-24 00:06:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:08:36.882243551 +0000 UTC m=+180.871868060" watchObservedRunningTime="2026-02-24 00:08:36.914297149 +0000 UTC m=+180.903921618"
Feb 24 00:08:36 crc kubenswrapper[4824]: I0224 00:08:36.931173 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dbxhh" podStartSLOduration=106.931144471 podStartE2EDuration="1m46.931144471s" podCreationTimestamp="2026-02-24 00:06:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:08:36.930051841 +0000 UTC m=+180.919676310" watchObservedRunningTime="2026-02-24 00:08:36.931144471 +0000 UTC m=+180.920768950"
Feb 24 00:08:36 crc kubenswrapper[4824]: I0224 00:08:36.964168 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=44.964145494 podStartE2EDuration="44.964145494s" podCreationTimestamp="2026-02-24 00:07:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:08:36.948537557 +0000 UTC m=+180.938162046" watchObservedRunningTime="2026-02-24 00:08:36.964145494 +0000 UTC m=+180.953769973"
Feb 24 00:08:36 crc kubenswrapper[4824]: I0224 00:08:36.977016 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-nwxht" podStartSLOduration=107.976995836 podStartE2EDuration="1m47.976995836s" podCreationTimestamp="2026-02-24 00:06:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:08:36.976649397 +0000 UTC m=+180.966273896" watchObservedRunningTime="2026-02-24 00:08:36.976995836 +0000 UTC m=+180.966620315"
Feb 24 00:08:36 crc kubenswrapper[4824]: I0224 00:08:36.991231 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podStartSLOduration=106.991207906 podStartE2EDuration="1m46.991207906s" podCreationTimestamp="2026-02-24 00:06:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:08:36.991149274 +0000 UTC m=+180.980773753" watchObservedRunningTime="2026-02-24 00:08:36.991207906 +0000 UTC m=+180.980832365"
Feb 24 00:08:37 crc kubenswrapper[4824]: I0224 00:08:37.012445 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-d64vq" podStartSLOduration=107.012425867 podStartE2EDuration="1m47.012425867s" podCreationTimestamp="2026-02-24 00:06:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:08:37.011969294 +0000 UTC m=+181.001593783" watchObservedRunningTime="2026-02-24 00:08:37.012425867 +0000 UTC m=+181.002050336"
Feb 24 00:08:37 crc kubenswrapper[4824]: I0224 00:08:37.042794 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-2zsq6" podStartSLOduration=108.042766958 podStartE2EDuration="1m48.042766958s" podCreationTimestamp="2026-02-24 00:06:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:08:37.027617343 +0000 UTC m=+181.017241812" watchObservedRunningTime="2026-02-24 00:08:37.042766958 +0000 UTC m=+181.032391427"
Feb 24 00:08:37 crc kubenswrapper[4824]: I0224 00:08:37.061096 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=41.061072629 podStartE2EDuration="41.061072629s" podCreationTimestamp="2026-02-24 00:07:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:08:37.04393543 +0000 UTC m=+181.033559899" watchObservedRunningTime="2026-02-24 00:08:37.061072629 +0000 UTC m=+181.050697098"
Feb 24 00:08:37 crc kubenswrapper[4824]: I0224 00:08:37.692921 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 24 00:08:37 crc kubenswrapper[4824]: E0224 00:08:37.693099 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 24 00:08:37 crc kubenswrapper[4824]: I0224 00:08:37.783641 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 08:32:21.416339334 +0000 UTC
Feb 24 00:08:38 crc kubenswrapper[4824]: I0224 00:08:38.693421 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 24 00:08:38 crc kubenswrapper[4824]: I0224 00:08:38.693441 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z42"
Feb 24 00:08:38 crc kubenswrapper[4824]: I0224 00:08:38.693634 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 24 00:08:38 crc kubenswrapper[4824]: E0224 00:08:38.693839 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 24 00:08:38 crc kubenswrapper[4824]: E0224 00:08:38.693993 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z42" podUID="a648113f-3e46-4170-ba30-7155fefbb413"
Feb 24 00:08:38 crc kubenswrapper[4824]: E0224 00:08:38.694068 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 24 00:08:38 crc kubenswrapper[4824]: I0224 00:08:38.784497 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 08:49:02.046724666 +0000 UTC
Feb 24 00:08:39 crc kubenswrapper[4824]: I0224 00:08:39.693177 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 24 00:08:39 crc kubenswrapper[4824]: E0224 00:08:39.693358 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 24 00:08:39 crc kubenswrapper[4824]: I0224 00:08:39.785558 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 17:25:12.503278456 +0000 UTC
Feb 24 00:08:40 crc kubenswrapper[4824]: I0224 00:08:40.597418 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a648113f-3e46-4170-ba30-7155fefbb413-metrics-certs\") pod \"network-metrics-daemon-98z42\" (UID: \"a648113f-3e46-4170-ba30-7155fefbb413\") " pod="openshift-multus/network-metrics-daemon-98z42"
Feb 24 00:08:40 crc kubenswrapper[4824]: E0224 00:08:40.597664 4824 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 24 00:08:40 crc kubenswrapper[4824]: E0224 00:08:40.597762 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a648113f-3e46-4170-ba30-7155fefbb413-metrics-certs podName:a648113f-3e46-4170-ba30-7155fefbb413 nodeName:}" failed. No retries permitted until 2026-02-24 00:09:44.597734963 +0000 UTC m=+248.587359472 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a648113f-3e46-4170-ba30-7155fefbb413-metrics-certs") pod "network-metrics-daemon-98z42" (UID: "a648113f-3e46-4170-ba30-7155fefbb413") : object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 24 00:08:40 crc kubenswrapper[4824]: I0224 00:08:40.693862 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 24 00:08:40 crc kubenswrapper[4824]: E0224 00:08:40.694059 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 24 00:08:40 crc kubenswrapper[4824]: I0224 00:08:40.694311 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z42"
Feb 24 00:08:40 crc kubenswrapper[4824]: E0224 00:08:40.694433 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z42" podUID="a648113f-3e46-4170-ba30-7155fefbb413"
Feb 24 00:08:40 crc kubenswrapper[4824]: I0224 00:08:40.694686 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 24 00:08:40 crc kubenswrapper[4824]: E0224 00:08:40.694763 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 24 00:08:40 crc kubenswrapper[4824]: I0224 00:08:40.785783 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 14:05:07.907587907 +0000 UTC
Feb 24 00:08:41 crc kubenswrapper[4824]: I0224 00:08:41.693114 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 24 00:08:41 crc kubenswrapper[4824]: E0224 00:08:41.693306 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 24 00:08:41 crc kubenswrapper[4824]: I0224 00:08:41.786217 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 00:41:56.208868043 +0000 UTC
Feb 24 00:08:41 crc kubenswrapper[4824]: E0224 00:08:41.789395 4824 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 24 00:08:42 crc kubenswrapper[4824]: I0224 00:08:42.693315 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 24 00:08:42 crc kubenswrapper[4824]: I0224 00:08:42.693377 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 24 00:08:42 crc kubenswrapper[4824]: E0224 00:08:42.693503 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 24 00:08:42 crc kubenswrapper[4824]: E0224 00:08:42.693621 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 24 00:08:42 crc kubenswrapper[4824]: I0224 00:08:42.693836 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z42"
Feb 24 00:08:42 crc kubenswrapper[4824]: E0224 00:08:42.693923 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z42" podUID="a648113f-3e46-4170-ba30-7155fefbb413"
Feb 24 00:08:42 crc kubenswrapper[4824]: I0224 00:08:42.787450 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 03:42:30.673417665 +0000 UTC
Feb 24 00:08:43 crc kubenswrapper[4824]: I0224 00:08:43.405888 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 00:08:43 crc kubenswrapper[4824]: I0224 00:08:43.405959 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 00:08:43 crc kubenswrapper[4824]: I0224 00:08:43.405972 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 00:08:43 crc kubenswrapper[4824]: I0224 00:08:43.405995 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 00:08:43 crc kubenswrapper[4824]: I0224 00:08:43.406009 4824 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T00:08:43Z","lastTransitionTime":"2026-02-24T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 00:08:43 crc kubenswrapper[4824]: I0224 00:08:43.451361 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-qcp8r"]
Feb 24 00:08:43 crc kubenswrapper[4824]: I0224 00:08:43.452100 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qcp8r"
Feb 24 00:08:43 crc kubenswrapper[4824]: I0224 00:08:43.456165 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Feb 24 00:08:43 crc kubenswrapper[4824]: I0224 00:08:43.456214 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Feb 24 00:08:43 crc kubenswrapper[4824]: I0224 00:08:43.456464 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Feb 24 00:08:43 crc kubenswrapper[4824]: I0224 00:08:43.458639 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Feb 24 00:08:43 crc kubenswrapper[4824]: I0224 00:08:43.544070 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/98c072cc-8e2f-446c-b225-23f6d6e08ffd-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-qcp8r\" (UID: \"98c072cc-8e2f-446c-b225-23f6d6e08ffd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qcp8r"
Feb 24 00:08:43 crc kubenswrapper[4824]: I0224 00:08:43.544130 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/98c072cc-8e2f-446c-b225-23f6d6e08ffd-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-qcp8r\" (UID: \"98c072cc-8e2f-446c-b225-23f6d6e08ffd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qcp8r"
Feb 24 00:08:43 crc kubenswrapper[4824]: I0224 00:08:43.544244 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/98c072cc-8e2f-446c-b225-23f6d6e08ffd-service-ca\") pod \"cluster-version-operator-5c965bbfc6-qcp8r\" (UID: \"98c072cc-8e2f-446c-b225-23f6d6e08ffd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qcp8r"
Feb 24 00:08:43 crc kubenswrapper[4824]: I0224 00:08:43.544478 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98c072cc-8e2f-446c-b225-23f6d6e08ffd-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-qcp8r\" (UID: \"98c072cc-8e2f-446c-b225-23f6d6e08ffd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qcp8r"
Feb 24 00:08:43 crc kubenswrapper[4824]: I0224 00:08:43.544626 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/98c072cc-8e2f-446c-b225-23f6d6e08ffd-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-qcp8r\" (UID: \"98c072cc-8e2f-446c-b225-23f6d6e08ffd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qcp8r"
Feb 24 00:08:43 crc kubenswrapper[4824]: I0224 00:08:43.645550 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/98c072cc-8e2f-446c-b225-23f6d6e08ffd-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-qcp8r\" (UID: \"98c072cc-8e2f-446c-b225-23f6d6e08ffd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qcp8r"
Feb 24 00:08:43 crc kubenswrapper[4824]: I0224 00:08:43.645640 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/98c072cc-8e2f-446c-b225-23f6d6e08ffd-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-qcp8r\" (UID: \"98c072cc-8e2f-446c-b225-23f6d6e08ffd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qcp8r"
Feb 24 00:08:43 crc kubenswrapper[4824]: I0224 00:08:43.645684 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/98c072cc-8e2f-446c-b225-23f6d6e08ffd-service-ca\") pod \"cluster-version-operator-5c965bbfc6-qcp8r\" (UID: \"98c072cc-8e2f-446c-b225-23f6d6e08ffd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qcp8r"
Feb 24 00:08:43 crc kubenswrapper[4824]: I0224 00:08:43.645724 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98c072cc-8e2f-446c-b225-23f6d6e08ffd-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-qcp8r\" (UID: \"98c072cc-8e2f-446c-b225-23f6d6e08ffd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qcp8r"
Feb 24 00:08:43 crc kubenswrapper[4824]: I0224 00:08:43.645748 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/98c072cc-8e2f-446c-b225-23f6d6e08ffd-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-qcp8r\" (UID: \"98c072cc-8e2f-446c-b225-23f6d6e08ffd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qcp8r"
Feb 24 00:08:43 crc kubenswrapper[4824]: I0224 00:08:43.645804 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/98c072cc-8e2f-446c-b225-23f6d6e08ffd-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-qcp8r\" (UID: \"98c072cc-8e2f-446c-b225-23f6d6e08ffd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qcp8r"
Feb 24 00:08:43 crc kubenswrapper[4824]: I0224 00:08:43.645826 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/98c072cc-8e2f-446c-b225-23f6d6e08ffd-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-qcp8r\" (UID: \"98c072cc-8e2f-446c-b225-23f6d6e08ffd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qcp8r"
Feb 24 00:08:43 crc kubenswrapper[4824]: I0224 00:08:43.646899 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/98c072cc-8e2f-446c-b225-23f6d6e08ffd-service-ca\") pod \"cluster-version-operator-5c965bbfc6-qcp8r\" (UID: \"98c072cc-8e2f-446c-b225-23f6d6e08ffd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qcp8r"
Feb 24 00:08:43 crc kubenswrapper[4824]: I0224 00:08:43.658069 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98c072cc-8e2f-446c-b225-23f6d6e08ffd-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-qcp8r\" (UID: \"98c072cc-8e2f-446c-b225-23f6d6e08ffd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qcp8r"
Feb 24 00:08:43 crc kubenswrapper[4824]: I0224 00:08:43.672124 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/98c072cc-8e2f-446c-b225-23f6d6e08ffd-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-qcp8r\" (UID: \"98c072cc-8e2f-446c-b225-23f6d6e08ffd\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qcp8r"
Feb 24 00:08:43 crc kubenswrapper[4824]: I0224 00:08:43.693243 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 24 00:08:43 crc kubenswrapper[4824]: E0224 00:08:43.693697 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 24 00:08:43 crc kubenswrapper[4824]: I0224 00:08:43.774746 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qcp8r"
Feb 24 00:08:43 crc kubenswrapper[4824]: I0224 00:08:43.788433 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 13:19:30.045191828 +0000 UTC
Feb 24 00:08:43 crc kubenswrapper[4824]: I0224 00:08:43.788908 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates
Feb 24 00:08:43 crc kubenswrapper[4824]: I0224 00:08:43.798676 4824 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Feb 24 00:08:43 crc kubenswrapper[4824]: W0224 00:08:43.800879 4824 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98c072cc_8e2f_446c_b225_23f6d6e08ffd.slice/crio-bb6282df6113fc1e7f90312eaf17f161b56b9661ec8064503be27b816caa83e8 WatchSource:0}: Error finding container bb6282df6113fc1e7f90312eaf17f161b56b9661ec8064503be27b816caa83e8: Status 404 returned error can't find the container with id bb6282df6113fc1e7f90312eaf17f161b56b9661ec8064503be27b816caa83e8
Feb 24 00:08:44 crc kubenswrapper[4824]: I0224 00:08:44.617971 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qcp8r" event={"ID":"98c072cc-8e2f-446c-b225-23f6d6e08ffd","Type":"ContainerStarted","Data":"0c475f8ecfb6b09b7dc0d57991f13c43dd262bf323ce0a90ba820375a27478af"}
Feb 24 00:08:44 crc kubenswrapper[4824]: I0224 00:08:44.618037 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qcp8r" event={"ID":"98c072cc-8e2f-446c-b225-23f6d6e08ffd","Type":"ContainerStarted","Data":"bb6282df6113fc1e7f90312eaf17f161b56b9661ec8064503be27b816caa83e8"}
Feb 24 00:08:44 crc kubenswrapper[4824]: I0224 00:08:44.692867 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 24 00:08:44 crc kubenswrapper[4824]: I0224 00:08:44.692904 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z42"
Feb 24 00:08:44 crc kubenswrapper[4824]: I0224 00:08:44.693001 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 24 00:08:44 crc kubenswrapper[4824]: E0224 00:08:44.693287 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 24 00:08:44 crc kubenswrapper[4824]: E0224 00:08:44.693598 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z42" podUID="a648113f-3e46-4170-ba30-7155fefbb413"
Feb 24 00:08:44 crc kubenswrapper[4824]: E0224 00:08:44.693730 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 24 00:08:45 crc kubenswrapper[4824]: I0224 00:08:45.693097 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 24 00:08:45 crc kubenswrapper[4824]: E0224 00:08:45.693412 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 24 00:08:46 crc kubenswrapper[4824]: I0224 00:08:46.695483 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 24 00:08:46 crc kubenswrapper[4824]: I0224 00:08:46.695622 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z42"
Feb 24 00:08:46 crc kubenswrapper[4824]: E0224 00:08:46.695784 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 24 00:08:46 crc kubenswrapper[4824]: I0224 00:08:46.695996 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 24 00:08:46 crc kubenswrapper[4824]: E0224 00:08:46.696122 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z42" podUID="a648113f-3e46-4170-ba30-7155fefbb413"
Feb 24 00:08:46 crc kubenswrapper[4824]: E0224 00:08:46.696257 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 24 00:08:46 crc kubenswrapper[4824]: E0224 00:08:46.790149 4824 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 24 00:08:47 crc kubenswrapper[4824]: I0224 00:08:47.692704 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 24 00:08:47 crc kubenswrapper[4824]: E0224 00:08:47.692852 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 24 00:08:48 crc kubenswrapper[4824]: I0224 00:08:48.693038 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z42"
Feb 24 00:08:48 crc kubenswrapper[4824]: E0224 00:08:48.693230 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z42" podUID="a648113f-3e46-4170-ba30-7155fefbb413"
Feb 24 00:08:48 crc kubenswrapper[4824]: I0224 00:08:48.693341 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 24 00:08:48 crc kubenswrapper[4824]: E0224 00:08:48.693897 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 24 00:08:48 crc kubenswrapper[4824]: I0224 00:08:48.694001 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 24 00:08:48 crc kubenswrapper[4824]: E0224 00:08:48.694115 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 24 00:08:48 crc kubenswrapper[4824]: I0224 00:08:48.694151 4824 scope.go:117] "RemoveContainer" containerID="5817c201e3633a7aa1c9aa18cf5c3bf28f2a23c56257be31e935e9e29bd522ac"
Feb 24 00:08:48 crc kubenswrapper[4824]: E0224 00:08:48.694509 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4xjg6_openshift-ovn-kubernetes(d985b875-dd5e-4767-a4e2-209894575a8f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" podUID="d985b875-dd5e-4767-a4e2-209894575a8f"
Feb 24 00:08:49 crc kubenswrapper[4824]: I0224 00:08:49.693535 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 24 00:08:49 crc kubenswrapper[4824]: E0224 00:08:49.693707 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 24 00:08:50 crc kubenswrapper[4824]: I0224 00:08:50.693366 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z42"
Feb 24 00:08:50 crc kubenswrapper[4824]: I0224 00:08:50.693446 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 24 00:08:50 crc kubenswrapper[4824]: I0224 00:08:50.693464 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 24 00:08:50 crc kubenswrapper[4824]: E0224 00:08:50.693607 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z42" podUID="a648113f-3e46-4170-ba30-7155fefbb413"
Feb 24 00:08:50 crc kubenswrapper[4824]: E0224 00:08:50.693702 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 24 00:08:50 crc kubenswrapper[4824]: E0224 00:08:50.693774 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 24 00:08:51 crc kubenswrapper[4824]: I0224 00:08:51.693084 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 24 00:08:51 crc kubenswrapper[4824]: E0224 00:08:51.693623 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 24 00:08:51 crc kubenswrapper[4824]: E0224 00:08:51.791615 4824 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 24 00:08:52 crc kubenswrapper[4824]: I0224 00:08:52.693481 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 24 00:08:52 crc kubenswrapper[4824]: I0224 00:08:52.693597 4824 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z42" Feb 24 00:08:52 crc kubenswrapper[4824]: I0224 00:08:52.693612 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:08:52 crc kubenswrapper[4824]: E0224 00:08:52.693729 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 00:08:52 crc kubenswrapper[4824]: E0224 00:08:52.693826 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 00:08:52 crc kubenswrapper[4824]: E0224 00:08:52.693973 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z42" podUID="a648113f-3e46-4170-ba30-7155fefbb413" Feb 24 00:08:53 crc kubenswrapper[4824]: I0224 00:08:53.693248 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:08:53 crc kubenswrapper[4824]: E0224 00:08:53.693652 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 00:08:54 crc kubenswrapper[4824]: I0224 00:08:54.693779 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:08:54 crc kubenswrapper[4824]: I0224 00:08:54.693827 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:08:54 crc kubenswrapper[4824]: I0224 00:08:54.693802 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z42" Feb 24 00:08:54 crc kubenswrapper[4824]: E0224 00:08:54.693963 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 00:08:54 crc kubenswrapper[4824]: E0224 00:08:54.694173 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z42" podUID="a648113f-3e46-4170-ba30-7155fefbb413" Feb 24 00:08:54 crc kubenswrapper[4824]: E0224 00:08:54.694370 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 00:08:55 crc kubenswrapper[4824]: I0224 00:08:55.693090 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:08:55 crc kubenswrapper[4824]: E0224 00:08:55.693481 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 00:08:56 crc kubenswrapper[4824]: I0224 00:08:56.692999 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:08:56 crc kubenswrapper[4824]: I0224 00:08:56.693083 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z42" Feb 24 00:08:56 crc kubenswrapper[4824]: I0224 00:08:56.693083 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:08:56 crc kubenswrapper[4824]: E0224 00:08:56.696075 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 00:08:56 crc kubenswrapper[4824]: E0224 00:08:56.696220 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z42" podUID="a648113f-3e46-4170-ba30-7155fefbb413" Feb 24 00:08:56 crc kubenswrapper[4824]: E0224 00:08:56.696390 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 00:08:56 crc kubenswrapper[4824]: E0224 00:08:56.792649 4824 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 24 00:08:57 crc kubenswrapper[4824]: I0224 00:08:57.693036 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:08:57 crc kubenswrapper[4824]: E0224 00:08:57.693192 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 00:08:58 crc kubenswrapper[4824]: I0224 00:08:58.679696 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wvqfl_15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac/kube-multus/1.log" Feb 24 00:08:58 crc kubenswrapper[4824]: I0224 00:08:58.681178 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wvqfl_15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac/kube-multus/0.log" Feb 24 00:08:58 crc kubenswrapper[4824]: I0224 00:08:58.681255 4824 generic.go:334] "Generic (PLEG): container finished" podID="15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac" containerID="a39d567bc7ae642dd15ce2ea8650e1aa0fd3688e3c3c061ad145edf2bf963c06" exitCode=1 Feb 24 00:08:58 crc kubenswrapper[4824]: I0224 00:08:58.681319 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wvqfl" event={"ID":"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac","Type":"ContainerDied","Data":"a39d567bc7ae642dd15ce2ea8650e1aa0fd3688e3c3c061ad145edf2bf963c06"} Feb 24 00:08:58 crc kubenswrapper[4824]: I0224 00:08:58.681377 4824 scope.go:117] "RemoveContainer" containerID="4cafa37efd3e722940c0c2daae6107dc324136c75f889084d39711c20d7e887d" Feb 24 00:08:58 crc kubenswrapper[4824]: I0224 00:08:58.682249 4824 scope.go:117] "RemoveContainer" containerID="a39d567bc7ae642dd15ce2ea8650e1aa0fd3688e3c3c061ad145edf2bf963c06" Feb 24 00:08:58 crc kubenswrapper[4824]: E0224 00:08:58.682641 4824 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-wvqfl_openshift-multus(15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac)\"" pod="openshift-multus/multus-wvqfl" podUID="15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac" Feb 24 00:08:58 crc kubenswrapper[4824]: I0224 00:08:58.694143 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:08:58 crc kubenswrapper[4824]: I0224 00:08:58.694195 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:08:58 crc kubenswrapper[4824]: E0224 00:08:58.694302 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 00:08:58 crc kubenswrapper[4824]: I0224 00:08:58.694144 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z42" Feb 24 00:08:58 crc kubenswrapper[4824]: E0224 00:08:58.694492 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 00:08:58 crc kubenswrapper[4824]: E0224 00:08:58.694592 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z42" podUID="a648113f-3e46-4170-ba30-7155fefbb413" Feb 24 00:08:58 crc kubenswrapper[4824]: I0224 00:08:58.704672 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qcp8r" podStartSLOduration=128.704645773 podStartE2EDuration="2m8.704645773s" podCreationTimestamp="2026-02-24 00:06:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:08:44.642211327 +0000 UTC m=+188.631835796" watchObservedRunningTime="2026-02-24 00:08:58.704645773 +0000 UTC m=+202.694270252" Feb 24 00:08:59 crc kubenswrapper[4824]: I0224 00:08:59.687431 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wvqfl_15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac/kube-multus/1.log" Feb 24 00:08:59 crc kubenswrapper[4824]: I0224 00:08:59.693714 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:08:59 crc kubenswrapper[4824]: E0224 00:08:59.694097 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 00:09:00 crc kubenswrapper[4824]: I0224 00:09:00.693398 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z42" Feb 24 00:09:00 crc kubenswrapper[4824]: I0224 00:09:00.693456 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:09:00 crc kubenswrapper[4824]: I0224 00:09:00.693583 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:09:00 crc kubenswrapper[4824]: E0224 00:09:00.693588 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z42" podUID="a648113f-3e46-4170-ba30-7155fefbb413" Feb 24 00:09:00 crc kubenswrapper[4824]: E0224 00:09:00.693741 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 00:09:00 crc kubenswrapper[4824]: E0224 00:09:00.693842 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 00:09:01 crc kubenswrapper[4824]: I0224 00:09:01.692818 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:09:01 crc kubenswrapper[4824]: E0224 00:09:01.693007 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 00:09:01 crc kubenswrapper[4824]: E0224 00:09:01.794195 4824 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 24 00:09:02 crc kubenswrapper[4824]: I0224 00:09:02.693647 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:09:02 crc kubenswrapper[4824]: I0224 00:09:02.693805 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:09:02 crc kubenswrapper[4824]: E0224 00:09:02.693807 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 00:09:02 crc kubenswrapper[4824]: E0224 00:09:02.694014 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 00:09:02 crc kubenswrapper[4824]: I0224 00:09:02.694297 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z42" Feb 24 00:09:02 crc kubenswrapper[4824]: E0224 00:09:02.694550 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z42" podUID="a648113f-3e46-4170-ba30-7155fefbb413" Feb 24 00:09:03 crc kubenswrapper[4824]: I0224 00:09:03.693592 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:09:03 crc kubenswrapper[4824]: E0224 00:09:03.694443 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 00:09:03 crc kubenswrapper[4824]: I0224 00:09:03.695145 4824 scope.go:117] "RemoveContainer" containerID="5817c201e3633a7aa1c9aa18cf5c3bf28f2a23c56257be31e935e9e29bd522ac" Feb 24 00:09:04 crc kubenswrapper[4824]: I0224 00:09:04.693367 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:09:04 crc kubenswrapper[4824]: I0224 00:09:04.693433 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:09:04 crc kubenswrapper[4824]: I0224 00:09:04.693433 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z42" Feb 24 00:09:04 crc kubenswrapper[4824]: E0224 00:09:04.693644 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 00:09:04 crc kubenswrapper[4824]: E0224 00:09:04.693743 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 00:09:04 crc kubenswrapper[4824]: E0224 00:09:04.693831 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z42" podUID="a648113f-3e46-4170-ba30-7155fefbb413" Feb 24 00:09:04 crc kubenswrapper[4824]: I0224 00:09:04.706113 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xjg6_d985b875-dd5e-4767-a4e2-209894575a8f/ovnkube-controller/3.log" Feb 24 00:09:04 crc kubenswrapper[4824]: I0224 00:09:04.709534 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" event={"ID":"d985b875-dd5e-4767-a4e2-209894575a8f","Type":"ContainerStarted","Data":"a7e1ce655b2a5d070089d56de6a8dfa5549a4154911bfccd6be2b38df9c3b5ee"} Feb 24 00:09:04 crc kubenswrapper[4824]: I0224 00:09:04.710090 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:09:04 crc kubenswrapper[4824]: I0224 00:09:04.746952 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" podStartSLOduration=134.746926636 podStartE2EDuration="2m14.746926636s" podCreationTimestamp="2026-02-24 00:06:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:09:04.746544376 +0000 UTC m=+208.736168865" watchObservedRunningTime="2026-02-24 00:09:04.746926636 +0000 UTC m=+208.736551125" Feb 24 00:09:04 crc kubenswrapper[4824]: I0224 00:09:04.805814 4824 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-multus/network-metrics-daemon-98z42"] Feb 24 00:09:04 crc kubenswrapper[4824]: I0224 00:09:04.806292 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z42" Feb 24 00:09:04 crc kubenswrapper[4824]: E0224 00:09:04.806399 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z42" podUID="a648113f-3e46-4170-ba30-7155fefbb413" Feb 24 00:09:05 crc kubenswrapper[4824]: I0224 00:09:05.693597 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:09:05 crc kubenswrapper[4824]: E0224 00:09:05.693967 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 00:09:06 crc kubenswrapper[4824]: I0224 00:09:06.693247 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z42" Feb 24 00:09:06 crc kubenswrapper[4824]: I0224 00:09:06.693304 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:09:06 crc kubenswrapper[4824]: I0224 00:09:06.693304 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 24 00:09:06 crc kubenswrapper[4824]: E0224 00:09:06.696069 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z42" podUID="a648113f-3e46-4170-ba30-7155fefbb413"
Feb 24 00:09:06 crc kubenswrapper[4824]: E0224 00:09:06.696295 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 24 00:09:06 crc kubenswrapper[4824]: E0224 00:09:06.696408 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 24 00:09:06 crc kubenswrapper[4824]: E0224 00:09:06.795279 4824 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 24 00:09:07 crc kubenswrapper[4824]: I0224 00:09:07.693130 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 24 00:09:07 crc kubenswrapper[4824]: E0224 00:09:07.693383 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 24 00:09:08 crc kubenswrapper[4824]: I0224 00:09:08.693582 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 24 00:09:08 crc kubenswrapper[4824]: I0224 00:09:08.693680 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z42"
Feb 24 00:09:08 crc kubenswrapper[4824]: I0224 00:09:08.693732 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 24 00:09:08 crc kubenswrapper[4824]: E0224 00:09:08.694330 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 24 00:09:08 crc kubenswrapper[4824]: E0224 00:09:08.694757 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 24 00:09:08 crc kubenswrapper[4824]: E0224 00:09:08.694680 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z42" podUID="a648113f-3e46-4170-ba30-7155fefbb413"
Feb 24 00:09:08 crc kubenswrapper[4824]: I0224 00:09:08.731081 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 24 00:09:08 crc kubenswrapper[4824]: I0224 00:09:08.731282 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 24 00:09:08 crc kubenswrapper[4824]: I0224 00:09:08.731315 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 24 00:09:08 crc kubenswrapper[4824]: E0224 00:09:08.731377 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:11:10.731332748 +0000 UTC m=+334.720957297 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 00:09:08 crc kubenswrapper[4824]: E0224 00:09:08.731444 4824 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 24 00:09:08 crc kubenswrapper[4824]: E0224 00:09:08.731461 4824 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 24 00:09:08 crc kubenswrapper[4824]: E0224 00:09:08.731473 4824 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 24 00:09:08 crc kubenswrapper[4824]: I0224 00:09:08.731487 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 24 00:09:08 crc kubenswrapper[4824]: E0224 00:09:08.731551 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-24 00:11:10.731512322 +0000 UTC m=+334.721136981 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 24 00:09:08 crc kubenswrapper[4824]: I0224 00:09:08.731595 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 24 00:09:08 crc kubenswrapper[4824]: E0224 00:09:08.731617 4824 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 24 00:09:08 crc kubenswrapper[4824]: E0224 00:09:08.731835 4824 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 24 00:09:08 crc kubenswrapper[4824]: E0224 00:09:08.731854 4824 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 24 00:09:08 crc kubenswrapper[4824]: E0224 00:09:08.731910 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-24 00:11:10.731897432 +0000 UTC m=+334.721522101 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 24 00:09:08 crc kubenswrapper[4824]: E0224 00:09:08.731710 4824 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 24 00:09:08 crc kubenswrapper[4824]: E0224 00:09:08.732140 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 00:11:10.732132999 +0000 UTC m=+334.721757468 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 24 00:09:08 crc kubenswrapper[4824]: E0224 00:09:08.731733 4824 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Feb 24 00:09:08 crc kubenswrapper[4824]: E0224 00:09:08.732244 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 00:11:10.732237431 +0000 UTC m=+334.721861900 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Feb 24 00:09:09 crc kubenswrapper[4824]: I0224 00:09:09.692864 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 24 00:09:09 crc kubenswrapper[4824]: E0224 00:09:09.693032 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 24 00:09:10 crc kubenswrapper[4824]: I0224 00:09:10.693030 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 24 00:09:10 crc kubenswrapper[4824]: I0224 00:09:10.693149 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 24 00:09:10 crc kubenswrapper[4824]: E0224 00:09:10.693280 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 24 00:09:10 crc kubenswrapper[4824]: E0224 00:09:10.693421 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 24 00:09:10 crc kubenswrapper[4824]: I0224 00:09:10.693753 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z42"
Feb 24 00:09:10 crc kubenswrapper[4824]: E0224 00:09:10.693941 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z42" podUID="a648113f-3e46-4170-ba30-7155fefbb413"
Feb 24 00:09:11 crc kubenswrapper[4824]: I0224 00:09:11.693186 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 24 00:09:11 crc kubenswrapper[4824]: E0224 00:09:11.693421 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 24 00:09:11 crc kubenswrapper[4824]: I0224 00:09:11.693907 4824 scope.go:117] "RemoveContainer" containerID="a39d567bc7ae642dd15ce2ea8650e1aa0fd3688e3c3c061ad145edf2bf963c06"
Feb 24 00:09:11 crc kubenswrapper[4824]: E0224 00:09:11.796544 4824 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 24 00:09:12 crc kubenswrapper[4824]: I0224 00:09:12.693243 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 24 00:09:12 crc kubenswrapper[4824]: I0224 00:09:12.693347 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 24 00:09:12 crc kubenswrapper[4824]: E0224 00:09:12.693414 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 24 00:09:12 crc kubenswrapper[4824]: E0224 00:09:12.693602 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 24 00:09:12 crc kubenswrapper[4824]: I0224 00:09:12.693786 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z42"
Feb 24 00:09:12 crc kubenswrapper[4824]: E0224 00:09:12.693874 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z42" podUID="a648113f-3e46-4170-ba30-7155fefbb413"
Feb 24 00:09:12 crc kubenswrapper[4824]: I0224 00:09:12.738846 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wvqfl_15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac/kube-multus/1.log"
Feb 24 00:09:12 crc kubenswrapper[4824]: I0224 00:09:12.738916 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wvqfl" event={"ID":"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac","Type":"ContainerStarted","Data":"e2df584c430cf17f7bb0674c0cc149453f39f49408337d9789565a34a1bfcb68"}
Feb 24 00:09:13 crc kubenswrapper[4824]: I0224 00:09:13.692889 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 24 00:09:13 crc kubenswrapper[4824]: E0224 00:09:13.693404 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 24 00:09:14 crc kubenswrapper[4824]: I0224 00:09:14.693306 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 24 00:09:14 crc kubenswrapper[4824]: I0224 00:09:14.693390 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 24 00:09:14 crc kubenswrapper[4824]: I0224 00:09:14.693488 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z42"
Feb 24 00:09:14 crc kubenswrapper[4824]: E0224 00:09:14.693606 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 24 00:09:14 crc kubenswrapper[4824]: E0224 00:09:14.693794 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z42" podUID="a648113f-3e46-4170-ba30-7155fefbb413"
Feb 24 00:09:14 crc kubenswrapper[4824]: E0224 00:09:14.693995 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 24 00:09:15 crc kubenswrapper[4824]: I0224 00:09:15.693072 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 24 00:09:15 crc kubenswrapper[4824]: E0224 00:09:15.693269 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 24 00:09:16 crc kubenswrapper[4824]: I0224 00:09:16.693577 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 24 00:09:16 crc kubenswrapper[4824]: I0224 00:09:16.693649 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 24 00:09:16 crc kubenswrapper[4824]: E0224 00:09:16.695987 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 24 00:09:16 crc kubenswrapper[4824]: I0224 00:09:16.696050 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z42"
Feb 24 00:09:16 crc kubenswrapper[4824]: E0224 00:09:16.696142 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 24 00:09:16 crc kubenswrapper[4824]: E0224 00:09:16.696237 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-98z42" podUID="a648113f-3e46-4170-ba30-7155fefbb413"
Feb 24 00:09:17 crc kubenswrapper[4824]: I0224 00:09:17.693362 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 24 00:09:17 crc kubenswrapper[4824]: I0224 00:09:17.696696 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Feb 24 00:09:17 crc kubenswrapper[4824]: I0224 00:09:17.696868 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Feb 24 00:09:18 crc kubenswrapper[4824]: I0224 00:09:18.693092 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 24 00:09:18 crc kubenswrapper[4824]: I0224 00:09:18.693180 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z42"
Feb 24 00:09:18 crc kubenswrapper[4824]: I0224 00:09:18.693183 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 24 00:09:18 crc kubenswrapper[4824]: I0224 00:09:18.697066 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Feb 24 00:09:18 crc kubenswrapper[4824]: I0224 00:09:18.697186 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Feb 24 00:09:18 crc kubenswrapper[4824]: I0224 00:09:18.697731 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Feb 24 00:09:18 crc kubenswrapper[4824]: I0224 00:09:18.698232 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Feb 24 00:09:23 crc kubenswrapper[4824]: I0224 00:09:23.276933 4824 patch_prober.go:28] interesting pod/machine-config-daemon-vcbgn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 24 00:09:23 crc kubenswrapper[4824]: I0224 00:09:23.277035 4824 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 24 00:09:23 crc kubenswrapper[4824]: I0224 00:09:23.711842 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.030024 4824 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.080254 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-jm7qk"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.080824 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-jm7qk"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.083215 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.083404 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.084117 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-kh6hg"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.084849 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-kh6hg"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.084852 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m7jnm"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.085581 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m7jnm"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.086161 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-jkghx"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.086634 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jkghx"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.087099 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.087202 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-7vdck"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.087572 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-7vdck"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.087750 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9k27r"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.088156 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.088252 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9k27r"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.093744 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.098142 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.098188 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.098239 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.098307 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.098360 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.100246 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.100302 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.100315 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.100398 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.100415 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.100491 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.100887 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.101000 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.101036 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.101136 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.101244 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.101279 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.101361 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.101383 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.101495 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.101616 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.101700 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.101739 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.101818 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.102097 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.102303 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.102586 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.102814 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.102955 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.103239 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.103382 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.104026 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.104752 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-9ml5g"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.105421 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-9ml5g"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.110320 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.110590 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.113844 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.113980 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.114229 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.115256 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-pvqdd"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.115424 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.115625 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.115818 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-pvqdd"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.118110 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-r4c4b"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.118545 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.118663 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-r4c4b"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.118938 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.119062 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.119107 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.128832 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.129206 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.130940 4824 reflector.go:368]
Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.131154 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.131212 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.131155 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.131371 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-h7djl"] Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.132070 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.133793 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-h7djl" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.134367 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lwm9v"] Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.139573 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.139604 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lwm9v" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.139986 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.142024 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-xfl22"] Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.148334 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-vgr74"] Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.148572 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.148712 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tjdhm"] Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.148903 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.149070 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.149269 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.149414 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.149579 4824 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.149595 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tjdhm" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.149654 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-xfl22" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.149711 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vgr74" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.149866 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.154790 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jf5jw"] Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.155280 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-p5tqf"] Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.155685 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-p5tqf" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.155822 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-jf5jw" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.157904 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.158247 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.158359 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.158467 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.158611 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.158743 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.158806 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.159021 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.159195 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.159311 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 24 00:09:24 crc 
kubenswrapper[4824]: I0224 00:09:24.159416 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.159539 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.159639 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.159728 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.159821 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.160090 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.160252 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.162583 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.162719 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.162763 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.163942 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 24 00:09:24 crc kubenswrapper[4824]: 
I0224 00:09:24.167652 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-pruner-29531520-969xh"] Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.168089 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-b2k2f"] Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.168335 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-hl2rv"] Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.168813 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hl2rv" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.169131 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.169601 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-b2k2f" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.170083 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.170109 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29531520-969xh" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.170256 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.172622 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rvv9j"] Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.173320 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8dlbn"] Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.173805 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8dlbn" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.174123 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rvv9j" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.176858 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.178105 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.178234 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.178607 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.179453 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.179632 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.179858 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.180302 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-fp4wq"] Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.180582 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.180695 4824 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.180771 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.181263 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.181372 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-fp4wq" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.181758 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-ksmk7"] Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.182390 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ksmk7" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.183645 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.189836 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.190379 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-q8hvw"] Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.191073 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-q8hvw" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.193482 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-cxlfh"] Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.195858 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-cxlfh" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.214186 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ccm27"] Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.215281 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.228632 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.231094 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.237672 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srljx\" (UniqueName: \"kubernetes.io/projected/390f4e92-8639-45bb-b91c-a55773bfa293-kube-api-access-srljx\") pod \"openshift-apiserver-operator-796bbdcf4f-m7jnm\" (UID: \"390f4e92-8639-45bb-b91c-a55773bfa293\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m7jnm" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.237730 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/836fad19-b7d1-434c-9fd8-faf3eb1d80d1-audit-dir\") pod 
\"apiserver-7bbb656c7d-jkghx\" (UID: \"836fad19-b7d1-434c-9fd8-faf3eb1d80d1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jkghx" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.237751 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.237763 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/53344821-2f26-459a-9e42-003f3f1b5a87-images\") pod \"machine-api-operator-5694c8668f-kh6hg\" (UID: \"53344821-2f26-459a-9e42-003f3f1b5a87\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kh6hg" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.237854 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/49f90ffa-baa8-4d9b-bd17-3a12f9c6f115-client-ca\") pod \"controller-manager-879f6c89f-jm7qk\" (UID: \"49f90ffa-baa8-4d9b-bd17-3a12f9c6f115\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jm7qk" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.237880 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49f90ffa-baa8-4d9b-bd17-3a12f9c6f115-serving-cert\") pod \"controller-manager-879f6c89f-jm7qk\" (UID: \"49f90ffa-baa8-4d9b-bd17-3a12f9c6f115\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jm7qk" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.237933 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c4e1d48-7f8d-44b6-97b3-3ceccb35385b-config\") pod \"route-controller-manager-6576b87f9c-9k27r\" (UID: \"1c4e1d48-7f8d-44b6-97b3-3ceccb35385b\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9k27r" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.237966 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/d0c9bb52-a40f-4e1d-ba3d-fbb14eb74110-etcd-ca\") pod \"etcd-operator-b45778765-pvqdd\" (UID: \"d0c9bb52-a40f-4e1d-ba3d-fbb14eb74110\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pvqdd" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.237997 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9qkd\" (UniqueName: \"kubernetes.io/projected/01ed973e-7ed7-41ec-bea9-69d8c86e19ed-kube-api-access-n9qkd\") pod \"openshift-config-operator-7777fb866f-9ml5g\" (UID: \"01ed973e-7ed7-41ec-bea9-69d8c86e19ed\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-9ml5g" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.238148 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/53344821-2f26-459a-9e42-003f3f1b5a87-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-kh6hg\" (UID: \"53344821-2f26-459a-9e42-003f3f1b5a87\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kh6hg" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.238193 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/d0c9bb52-a40f-4e1d-ba3d-fbb14eb74110-etcd-service-ca\") pod \"etcd-operator-b45778765-pvqdd\" (UID: \"d0c9bb52-a40f-4e1d-ba3d-fbb14eb74110\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pvqdd" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.238316 4824 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.238778 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/836fad19-b7d1-434c-9fd8-faf3eb1d80d1-audit-policies\") pod \"apiserver-7bbb656c7d-jkghx\" (UID: \"836fad19-b7d1-434c-9fd8-faf3eb1d80d1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jkghx" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.238851 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/836fad19-b7d1-434c-9fd8-faf3eb1d80d1-etcd-client\") pod \"apiserver-7bbb656c7d-jkghx\" (UID: \"836fad19-b7d1-434c-9fd8-faf3eb1d80d1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jkghx" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.238893 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/01ed973e-7ed7-41ec-bea9-69d8c86e19ed-available-featuregates\") pod \"openshift-config-operator-7777fb866f-9ml5g\" (UID: \"01ed973e-7ed7-41ec-bea9-69d8c86e19ed\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-9ml5g" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.238933 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2595ad7b-d8cc-46d2-b0db-60e7ce636e9f-serving-cert\") pod \"authentication-operator-69f744f599-7vdck\" (UID: \"2595ad7b-d8cc-46d2-b0db-60e7ce636e9f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7vdck" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.238961 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ed973e-7ed7-41ec-bea9-69d8c86e19ed-serving-cert\") pod \"openshift-config-operator-7777fb866f-9ml5g\" (UID: \"01ed973e-7ed7-41ec-bea9-69d8c86e19ed\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-9ml5g" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.238983 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0c9bb52-a40f-4e1d-ba3d-fbb14eb74110-serving-cert\") pod \"etcd-operator-b45778765-pvqdd\" (UID: \"d0c9bb52-a40f-4e1d-ba3d-fbb14eb74110\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pvqdd" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.239015 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw5js\" (UniqueName: \"kubernetes.io/projected/1c4e1d48-7f8d-44b6-97b3-3ceccb35385b-kube-api-access-nw5js\") pod \"route-controller-manager-6576b87f9c-9k27r\" (UID: \"1c4e1d48-7f8d-44b6-97b3-3ceccb35385b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9k27r" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.239049 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/836fad19-b7d1-434c-9fd8-faf3eb1d80d1-encryption-config\") pod \"apiserver-7bbb656c7d-jkghx\" (UID: \"836fad19-b7d1-434c-9fd8-faf3eb1d80d1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jkghx" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.239078 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1c4e1d48-7f8d-44b6-97b3-3ceccb35385b-client-ca\") pod \"route-controller-manager-6576b87f9c-9k27r\" (UID: \"1c4e1d48-7f8d-44b6-97b3-3ceccb35385b\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9k27r"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.239105 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqjj7\" (UniqueName: \"kubernetes.io/projected/49f90ffa-baa8-4d9b-bd17-3a12f9c6f115-kube-api-access-sqjj7\") pod \"controller-manager-879f6c89f-jm7qk\" (UID: \"49f90ffa-baa8-4d9b-bd17-3a12f9c6f115\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jm7qk"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.239136 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2595ad7b-d8cc-46d2-b0db-60e7ce636e9f-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-7vdck\" (UID: \"2595ad7b-d8cc-46d2-b0db-60e7ce636e9f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7vdck"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.239179 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0c9bb52-a40f-4e1d-ba3d-fbb14eb74110-config\") pod \"etcd-operator-b45778765-pvqdd\" (UID: \"d0c9bb52-a40f-4e1d-ba3d-fbb14eb74110\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pvqdd"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.239202 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctbsw\" (UniqueName: \"kubernetes.io/projected/53344821-2f26-459a-9e42-003f3f1b5a87-kube-api-access-ctbsw\") pod \"machine-api-operator-5694c8668f-kh6hg\" (UID: \"53344821-2f26-459a-9e42-003f3f1b5a87\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kh6hg"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.239238 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/836fad19-b7d1-434c-9fd8-faf3eb1d80d1-serving-cert\") pod \"apiserver-7bbb656c7d-jkghx\" (UID: \"836fad19-b7d1-434c-9fd8-faf3eb1d80d1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jkghx"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.239303 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53344821-2f26-459a-9e42-003f3f1b5a87-config\") pod \"machine-api-operator-5694c8668f-kh6hg\" (UID: \"53344821-2f26-459a-9e42-003f3f1b5a87\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kh6hg"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.239351 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/390f4e92-8639-45bb-b91c-a55773bfa293-config\") pod \"openshift-apiserver-operator-796bbdcf4f-m7jnm\" (UID: \"390f4e92-8639-45bb-b91c-a55773bfa293\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m7jnm"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.239399 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/836fad19-b7d1-434c-9fd8-faf3eb1d80d1-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-jkghx\" (UID: \"836fad19-b7d1-434c-9fd8-faf3eb1d80d1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jkghx"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.239447 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmr92\" (UniqueName: \"kubernetes.io/projected/836fad19-b7d1-434c-9fd8-faf3eb1d80d1-kube-api-access-hmr92\") pod \"apiserver-7bbb656c7d-jkghx\" (UID: \"836fad19-b7d1-434c-9fd8-faf3eb1d80d1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jkghx"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.239473 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d0c9bb52-a40f-4e1d-ba3d-fbb14eb74110-etcd-client\") pod \"etcd-operator-b45778765-pvqdd\" (UID: \"d0c9bb52-a40f-4e1d-ba3d-fbb14eb74110\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pvqdd"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.239538 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/836fad19-b7d1-434c-9fd8-faf3eb1d80d1-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-jkghx\" (UID: \"836fad19-b7d1-434c-9fd8-faf3eb1d80d1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jkghx"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.239565 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2595ad7b-d8cc-46d2-b0db-60e7ce636e9f-service-ca-bundle\") pod \"authentication-operator-69f744f599-7vdck\" (UID: \"2595ad7b-d8cc-46d2-b0db-60e7ce636e9f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7vdck"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.239591 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zw9s\" (UniqueName: \"kubernetes.io/projected/2595ad7b-d8cc-46d2-b0db-60e7ce636e9f-kube-api-access-6zw9s\") pod \"authentication-operator-69f744f599-7vdck\" (UID: \"2595ad7b-d8cc-46d2-b0db-60e7ce636e9f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7vdck"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.239616 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c4e1d48-7f8d-44b6-97b3-3ceccb35385b-serving-cert\") pod \"route-controller-manager-6576b87f9c-9k27r\" (UID: \"1c4e1d48-7f8d-44b6-97b3-3ceccb35385b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9k27r"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.239637 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f4fs\" (UniqueName: \"kubernetes.io/projected/d0c9bb52-a40f-4e1d-ba3d-fbb14eb74110-kube-api-access-4f4fs\") pod \"etcd-operator-b45778765-pvqdd\" (UID: \"d0c9bb52-a40f-4e1d-ba3d-fbb14eb74110\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pvqdd"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.239662 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/390f4e92-8639-45bb-b91c-a55773bfa293-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-m7jnm\" (UID: \"390f4e92-8639-45bb-b91c-a55773bfa293\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m7jnm"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.239727 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/49f90ffa-baa8-4d9b-bd17-3a12f9c6f115-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-jm7qk\" (UID: \"49f90ffa-baa8-4d9b-bd17-3a12f9c6f115\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jm7qk"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.239761 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49f90ffa-baa8-4d9b-bd17-3a12f9c6f115-config\") pod \"controller-manager-879f6c89f-jm7qk\" (UID: \"49f90ffa-baa8-4d9b-bd17-3a12f9c6f115\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jm7qk"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.239788 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tj72b\" (UniqueName: \"kubernetes.io/projected/581e69ae-c21a-4a9e-b1ea-9c38256d7b30-kube-api-access-tj72b\") pod \"downloads-7954f5f757-r4c4b\" (UID: \"581e69ae-c21a-4a9e-b1ea-9c38256d7b30\") " pod="openshift-console/downloads-7954f5f757-r4c4b"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.239831 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2595ad7b-d8cc-46d2-b0db-60e7ce636e9f-config\") pod \"authentication-operator-69f744f599-7vdck\" (UID: \"2595ad7b-d8cc-46d2-b0db-60e7ce636e9f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7vdck"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.241262 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c94hb"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.243413 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.244810 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-w6pjq"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.245258 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xs7nb"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.245630 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-99tkw"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.245786 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-w6pjq"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.245786 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xs7nb"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.246133 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v9bck"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.246329 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-99tkw"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.246489 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v9bck"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.246535 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c94hb"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.246913 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-5ltfz"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.249300 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-8krrp"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.249511 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5ltfz"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.249845 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d95c9"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.249954 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-8krrp"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.250342 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d95c9"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.250788 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-ghwq4"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.251660 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ghwq4"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.252072 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-6h296"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.253925 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2jqvq"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.254004 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-6h296"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.254658 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29531520-xxvzq"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.255050 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29531520-xxvzq"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.255165 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2jqvq"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.255997 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-zlnwh"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.256563 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-zlnwh"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.257227 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-kh6hg"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.258778 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-4tvd9"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.259977 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4tvd9"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.260327 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9k27r"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.261868 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-pvqdd"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.262998 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-r4c4b"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.264334 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lwm9v"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.265561 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-jm7qk"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.266930 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29531520-969xh"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.268224 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m7jnm"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.269506 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-jkghx"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.270801 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-9ml5g"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.272038 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c94hb"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.273213 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-cxlfh"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.274154 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-b2k2f"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.275198 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-hl2rv"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.276101 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8dlbn"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.277249 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jf5jw"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.278073 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-w6pjq"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.279984 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-q8hvw"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.280975 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-p5tqf"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.283309 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.285502 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rvv9j"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.290465 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d95c9"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.291599 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-7vdck"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.294214 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xs7nb"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.295596 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-dq9gz"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.297353 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-vvlvv"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.297550 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-dq9gz"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.298044 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-vvlvv"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.298302 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-h7djl"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.299497 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4tvd9"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.300987 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-ksmk7"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.306277 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-8krrp"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.308333 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-xfl22"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.309021 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.312950 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-zlnwh"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.314007 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tjdhm"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.319362 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ccm27"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.321505 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2jqvq"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.323126 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.324729 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29531520-xxvzq"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.327464 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-99tkw"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.328974 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v9bck"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.330345 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-dq9gz"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.332076 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-6h296"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.333318 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-ghwq4"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.334349 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-5ltfz"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.335439 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-5n768"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.336479 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5n768"]
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.336602 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-5n768"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.341144 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2595ad7b-d8cc-46d2-b0db-60e7ce636e9f-config\") pod \"authentication-operator-69f744f599-7vdck\" (UID: \"2595ad7b-d8cc-46d2-b0db-60e7ce636e9f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7vdck"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.341191 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srljx\" (UniqueName: \"kubernetes.io/projected/390f4e92-8639-45bb-b91c-a55773bfa293-kube-api-access-srljx\") pod \"openshift-apiserver-operator-796bbdcf4f-m7jnm\" (UID: \"390f4e92-8639-45bb-b91c-a55773bfa293\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m7jnm"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.341219 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/53344821-2f26-459a-9e42-003f3f1b5a87-images\") pod \"machine-api-operator-5694c8668f-kh6hg\" (UID: \"53344821-2f26-459a-9e42-003f3f1b5a87\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kh6hg"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.341239 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/836fad19-b7d1-434c-9fd8-faf3eb1d80d1-audit-dir\") pod \"apiserver-7bbb656c7d-jkghx\" (UID: \"836fad19-b7d1-434c-9fd8-faf3eb1d80d1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jkghx"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.341257 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/49f90ffa-baa8-4d9b-bd17-3a12f9c6f115-client-ca\") pod \"controller-manager-879f6c89f-jm7qk\" (UID: \"49f90ffa-baa8-4d9b-bd17-3a12f9c6f115\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jm7qk"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.341274 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49f90ffa-baa8-4d9b-bd17-3a12f9c6f115-serving-cert\") pod \"controller-manager-879f6c89f-jm7qk\" (UID: \"49f90ffa-baa8-4d9b-bd17-3a12f9c6f115\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jm7qk"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.341297 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2tmt\" (UniqueName: \"kubernetes.io/projected/b14c3eec-796c-48b0-b4fe-67cb327f2de7-kube-api-access-c2tmt\") pod \"machine-config-controller-84d6567774-ksmk7\" (UID: \"b14c3eec-796c-48b0-b4fe-67cb327f2de7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ksmk7"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.341326 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/13bff804-f118-473b-a547-433aed671b46-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-q8hvw\" (UID: \"13bff804-f118-473b-a547-433aed671b46\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-q8hvw"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.341352 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b14c3eec-796c-48b0-b4fe-67cb327f2de7-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-ksmk7\" (UID: \"b14c3eec-796c-48b0-b4fe-67cb327f2de7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ksmk7"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.341359 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/836fad19-b7d1-434c-9fd8-faf3eb1d80d1-audit-dir\") pod \"apiserver-7bbb656c7d-jkghx\" (UID: \"836fad19-b7d1-434c-9fd8-faf3eb1d80d1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jkghx"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.341384 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6f8699c7-58f5-4a80-b5af-5403cb178676-audit-dir\") pod \"oauth-openshift-558db77b4-jf5jw\" (UID: \"6f8699c7-58f5-4a80-b5af-5403cb178676\") " pod="openshift-authentication/oauth-openshift-558db77b4-jf5jw"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.341697 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-jf5jw\" (UID: \"6f8699c7-58f5-4a80-b5af-5403cb178676\") " pod="openshift-authentication/oauth-openshift-558db77b4-jf5jw"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.342546 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-jf5jw\" (UID: \"6f8699c7-58f5-4a80-b5af-5403cb178676\") " pod="openshift-authentication/oauth-openshift-558db77b4-jf5jw"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.342728 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c4e1d48-7f8d-44b6-97b3-3ceccb35385b-config\") pod \"route-controller-manager-6576b87f9c-9k27r\" (UID: \"1c4e1d48-7f8d-44b6-97b3-3ceccb35385b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9k27r"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.342777 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-jf5jw\" (UID: \"6f8699c7-58f5-4a80-b5af-5403cb178676\") " pod="openshift-authentication/oauth-openshift-558db77b4-jf5jw"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.342810 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/33028c7d-6bd8-46bb-a7d9-e1c46b0a26e4-auth-proxy-config\") pod \"machine-config-operator-74547568cd-hl2rv\" (UID: \"33028c7d-6bd8-46bb-a7d9-e1c46b0a26e4\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hl2rv"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.342759 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/53344821-2f26-459a-9e42-003f3f1b5a87-images\") pod \"machine-api-operator-5694c8668f-kh6hg\" (UID: \"53344821-2f26-459a-9e42-003f3f1b5a87\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kh6hg"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.342840 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/d0c9bb52-a40f-4e1d-ba3d-fbb14eb74110-etcd-ca\") pod \"etcd-operator-b45778765-pvqdd\" (UID: \"d0c9bb52-a40f-4e1d-ba3d-fbb14eb74110\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pvqdd"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.342923 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9qkd\" (UniqueName: \"kubernetes.io/projected/01ed973e-7ed7-41ec-bea9-69d8c86e19ed-kube-api-access-n9qkd\") pod \"openshift-config-operator-7777fb866f-9ml5g\" (UID: \"01ed973e-7ed7-41ec-bea9-69d8c86e19ed\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-9ml5g"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.342967 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/53344821-2f26-459a-9e42-003f3f1b5a87-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-kh6hg\" (UID: \"53344821-2f26-459a-9e42-003f3f1b5a87\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kh6hg"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.343007 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/33028c7d-6bd8-46bb-a7d9-e1c46b0a26e4-images\") pod \"machine-config-operator-74547568cd-hl2rv\" (UID: \"33028c7d-6bd8-46bb-a7d9-e1c46b0a26e4\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hl2rv"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.343036 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/33028c7d-6bd8-46bb-a7d9-e1c46b0a26e4-proxy-tls\") pod \"machine-config-operator-74547568cd-hl2rv\" (UID: \"33028c7d-6bd8-46bb-a7d9-e1c46b0a26e4\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hl2rv"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.343062 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/d0c9bb52-a40f-4e1d-ba3d-fbb14eb74110-etcd-service-ca\") pod \"etcd-operator-b45778765-pvqdd\" (UID: \"d0c9bb52-a40f-4e1d-ba3d-fbb14eb74110\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pvqdd"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.343087 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-jf5jw\" (UID: \"6f8699c7-58f5-4a80-b5af-5403cb178676\") " pod="openshift-authentication/oauth-openshift-558db77b4-jf5jw"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.343118 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-jf5jw\" (UID: \"6f8699c7-58f5-4a80-b5af-5403cb178676\") " pod="openshift-authentication/oauth-openshift-558db77b4-jf5jw"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.343144 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4dl5\" (UniqueName: \"kubernetes.io/projected/6f8699c7-58f5-4a80-b5af-5403cb178676-kube-api-access-z4dl5\") pod \"oauth-openshift-558db77b4-jf5jw\" (UID: \"6f8699c7-58f5-4a80-b5af-5403cb178676\") " pod="openshift-authentication/oauth-openshift-558db77b4-jf5jw"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.343176 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/836fad19-b7d1-434c-9fd8-faf3eb1d80d1-audit-policies\") pod \"apiserver-7bbb656c7d-jkghx\" (UID: \"836fad19-b7d1-434c-9fd8-faf3eb1d80d1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jkghx"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.343203 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/250422bb-6e8f-4622-a456-ded5825e7c86-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-lwm9v\" (UID: \"250422bb-6e8f-4622-a456-ded5825e7c86\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lwm9v"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.343234 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/836fad19-b7d1-434c-9fd8-faf3eb1d80d1-etcd-client\") pod \"apiserver-7bbb656c7d-jkghx\" (UID: \"836fad19-b7d1-434c-9fd8-faf3eb1d80d1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jkghx"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.343265 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-jf5jw\" (UID: \"6f8699c7-58f5-4a80-b5af-5403cb178676\") " pod="openshift-authentication/oauth-openshift-558db77b4-jf5jw"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.343305 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/01ed973e-7ed7-41ec-bea9-69d8c86e19ed-available-featuregates\") pod \"openshift-config-operator-7777fb866f-9ml5g\" (UID: \"01ed973e-7ed7-41ec-bea9-69d8c86e19ed\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-9ml5g"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.343327 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b14c3eec-796c-48b0-b4fe-67cb327f2de7-proxy-tls\") pod \"machine-config-controller-84d6567774-ksmk7\" (UID: \"b14c3eec-796c-48b0-b4fe-67cb327f2de7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ksmk7"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.343354 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2595ad7b-d8cc-46d2-b0db-60e7ce636e9f-serving-cert\") pod \"authentication-operator-69f744f599-7vdck\" (UID: \"2595ad7b-d8cc-46d2-b0db-60e7ce636e9f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7vdck"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.343376 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-jf5jw\" (UID: \"6f8699c7-58f5-4a80-b5af-5403cb178676\") " pod="openshift-authentication/oauth-openshift-558db77b4-jf5jw"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.343399 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ed973e-7ed7-41ec-bea9-69d8c86e19ed-serving-cert\") pod \"openshift-config-operator-7777fb866f-9ml5g\" (UID: \"01ed973e-7ed7-41ec-bea9-69d8c86e19ed\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-9ml5g"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.343427 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nw5js\" (UniqueName: \"kubernetes.io/projected/1c4e1d48-7f8d-44b6-97b3-3ceccb35385b-kube-api-access-nw5js\") pod \"route-controller-manager-6576b87f9c-9k27r\" (UID: \"1c4e1d48-7f8d-44b6-97b3-3ceccb35385b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9k27r"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.343449 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0c9bb52-a40f-4e1d-ba3d-fbb14eb74110-serving-cert\") pod \"etcd-operator-b45778765-pvqdd\" (UID: \"d0c9bb52-a40f-4e1d-ba3d-fbb14eb74110\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pvqdd"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.343476 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/836fad19-b7d1-434c-9fd8-faf3eb1d80d1-encryption-config\") pod \"apiserver-7bbb656c7d-jkghx\" (UID: \"836fad19-b7d1-434c-9fd8-faf3eb1d80d1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jkghx"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.343496 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1c4e1d48-7f8d-44b6-97b3-3ceccb35385b-client-ca\") pod \"route-controller-manager-6576b87f9c-9k27r\" (UID: \"1c4e1d48-7f8d-44b6-97b3-3ceccb35385b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9k27r"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.343537 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/250422bb-6e8f-4622-a456-ded5825e7c86-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-lwm9v\" (UID: \"250422bb-6e8f-4622-a456-ded5825e7c86\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lwm9v"
Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.343566 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqjj7\" (UniqueName:
\"kubernetes.io/projected/49f90ffa-baa8-4d9b-bd17-3a12f9c6f115-kube-api-access-sqjj7\") pod \"controller-manager-879f6c89f-jm7qk\" (UID: \"49f90ffa-baa8-4d9b-bd17-3a12f9c6f115\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jm7qk" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.343593 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2595ad7b-d8cc-46d2-b0db-60e7ce636e9f-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-7vdck\" (UID: \"2595ad7b-d8cc-46d2-b0db-60e7ce636e9f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7vdck" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.343638 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85fxs\" (UniqueName: \"kubernetes.io/projected/13bff804-f118-473b-a547-433aed671b46-kube-api-access-85fxs\") pod \"control-plane-machine-set-operator-78cbb6b69f-q8hvw\" (UID: \"13bff804-f118-473b-a547-433aed671b46\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-q8hvw" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.343656 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/d0c9bb52-a40f-4e1d-ba3d-fbb14eb74110-etcd-ca\") pod \"etcd-operator-b45778765-pvqdd\" (UID: \"d0c9bb52-a40f-4e1d-ba3d-fbb14eb74110\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pvqdd" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.343667 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0c9bb52-a40f-4e1d-ba3d-fbb14eb74110-config\") pod \"etcd-operator-b45778765-pvqdd\" (UID: \"d0c9bb52-a40f-4e1d-ba3d-fbb14eb74110\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pvqdd" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 
00:09:24.343718 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/d0c9bb52-a40f-4e1d-ba3d-fbb14eb74110-etcd-service-ca\") pod \"etcd-operator-b45778765-pvqdd\" (UID: \"d0c9bb52-a40f-4e1d-ba3d-fbb14eb74110\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pvqdd" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.343729 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctbsw\" (UniqueName: \"kubernetes.io/projected/53344821-2f26-459a-9e42-003f3f1b5a87-kube-api-access-ctbsw\") pod \"machine-api-operator-5694c8668f-kh6hg\" (UID: \"53344821-2f26-459a-9e42-003f3f1b5a87\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kh6hg" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.343795 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sk6p9\" (UniqueName: \"kubernetes.io/projected/250422bb-6e8f-4622-a456-ded5825e7c86-kube-api-access-sk6p9\") pod \"openshift-controller-manager-operator-756b6f6bc6-lwm9v\" (UID: \"250422bb-6e8f-4622-a456-ded5825e7c86\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lwm9v" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.343833 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/836fad19-b7d1-434c-9fd8-faf3eb1d80d1-serving-cert\") pod \"apiserver-7bbb656c7d-jkghx\" (UID: \"836fad19-b7d1-434c-9fd8-faf3eb1d80d1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jkghx" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.343852 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53344821-2f26-459a-9e42-003f3f1b5a87-config\") pod \"machine-api-operator-5694c8668f-kh6hg\" (UID: 
\"53344821-2f26-459a-9e42-003f3f1b5a87\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kh6hg" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.343870 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/390f4e92-8639-45bb-b91c-a55773bfa293-config\") pod \"openshift-apiserver-operator-796bbdcf4f-m7jnm\" (UID: \"390f4e92-8639-45bb-b91c-a55773bfa293\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m7jnm" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.343893 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8118fe3c-1479-4634-9b64-9350991d909d-metrics-tls\") pod \"dns-operator-744455d44c-h7djl\" (UID: \"8118fe3c-1479-4634-9b64-9350991d909d\") " pod="openshift-dns-operator/dns-operator-744455d44c-h7djl" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.343917 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/836fad19-b7d1-434c-9fd8-faf3eb1d80d1-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-jkghx\" (UID: \"836fad19-b7d1-434c-9fd8-faf3eb1d80d1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jkghx" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.343927 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c4e1d48-7f8d-44b6-97b3-3ceccb35385b-config\") pod \"route-controller-manager-6576b87f9c-9k27r\" (UID: \"1c4e1d48-7f8d-44b6-97b3-3ceccb35385b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9k27r" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.343940 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 24 00:09:24 crc 
kubenswrapper[4824]: I0224 00:09:24.343935 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmr92\" (UniqueName: \"kubernetes.io/projected/836fad19-b7d1-434c-9fd8-faf3eb1d80d1-kube-api-access-hmr92\") pod \"apiserver-7bbb656c7d-jkghx\" (UID: \"836fad19-b7d1-434c-9fd8-faf3eb1d80d1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jkghx" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.343988 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d0c9bb52-a40f-4e1d-ba3d-fbb14eb74110-etcd-client\") pod \"etcd-operator-b45778765-pvqdd\" (UID: \"d0c9bb52-a40f-4e1d-ba3d-fbb14eb74110\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pvqdd" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.344019 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55rl5\" (UniqueName: \"kubernetes.io/projected/8118fe3c-1479-4634-9b64-9350991d909d-kube-api-access-55rl5\") pod \"dns-operator-744455d44c-h7djl\" (UID: \"8118fe3c-1479-4634-9b64-9350991d909d\") " pod="openshift-dns-operator/dns-operator-744455d44c-h7djl" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.344047 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgctm\" (UniqueName: \"kubernetes.io/projected/33028c7d-6bd8-46bb-a7d9-e1c46b0a26e4-kube-api-access-vgctm\") pod \"machine-config-operator-74547568cd-hl2rv\" (UID: \"33028c7d-6bd8-46bb-a7d9-e1c46b0a26e4\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hl2rv" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.344068 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-system-session\") pod 
\"oauth-openshift-558db77b4-jf5jw\" (UID: \"6f8699c7-58f5-4a80-b5af-5403cb178676\") " pod="openshift-authentication/oauth-openshift-558db77b4-jf5jw" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.344093 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/836fad19-b7d1-434c-9fd8-faf3eb1d80d1-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-jkghx\" (UID: \"836fad19-b7d1-434c-9fd8-faf3eb1d80d1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jkghx" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.344117 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2595ad7b-d8cc-46d2-b0db-60e7ce636e9f-service-ca-bundle\") pod \"authentication-operator-69f744f599-7vdck\" (UID: \"2595ad7b-d8cc-46d2-b0db-60e7ce636e9f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7vdck" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.344142 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zw9s\" (UniqueName: \"kubernetes.io/projected/2595ad7b-d8cc-46d2-b0db-60e7ce636e9f-kube-api-access-6zw9s\") pod \"authentication-operator-69f744f599-7vdck\" (UID: \"2595ad7b-d8cc-46d2-b0db-60e7ce636e9f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7vdck" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.344170 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-jf5jw\" (UID: \"6f8699c7-58f5-4a80-b5af-5403cb178676\") " pod="openshift-authentication/oauth-openshift-558db77b4-jf5jw" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.344193 4824 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4f4fs\" (UniqueName: \"kubernetes.io/projected/d0c9bb52-a40f-4e1d-ba3d-fbb14eb74110-kube-api-access-4f4fs\") pod \"etcd-operator-b45778765-pvqdd\" (UID: \"d0c9bb52-a40f-4e1d-ba3d-fbb14eb74110\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pvqdd" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.344214 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/390f4e92-8639-45bb-b91c-a55773bfa293-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-m7jnm\" (UID: \"390f4e92-8639-45bb-b91c-a55773bfa293\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m7jnm" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.344232 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c4e1d48-7f8d-44b6-97b3-3ceccb35385b-serving-cert\") pod \"route-controller-manager-6576b87f9c-9k27r\" (UID: \"1c4e1d48-7f8d-44b6-97b3-3ceccb35385b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9k27r" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.344270 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0c9bb52-a40f-4e1d-ba3d-fbb14eb74110-config\") pod \"etcd-operator-b45778765-pvqdd\" (UID: \"d0c9bb52-a40f-4e1d-ba3d-fbb14eb74110\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pvqdd" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.344280 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-jf5jw\" (UID: \"6f8699c7-58f5-4a80-b5af-5403cb178676\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-jf5jw" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.344341 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-jf5jw\" (UID: \"6f8699c7-58f5-4a80-b5af-5403cb178676\") " pod="openshift-authentication/oauth-openshift-558db77b4-jf5jw" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.344369 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/49f90ffa-baa8-4d9b-bd17-3a12f9c6f115-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-jm7qk\" (UID: \"49f90ffa-baa8-4d9b-bd17-3a12f9c6f115\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jm7qk" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.344398 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49f90ffa-baa8-4d9b-bd17-3a12f9c6f115-config\") pod \"controller-manager-879f6c89f-jm7qk\" (UID: \"49f90ffa-baa8-4d9b-bd17-3a12f9c6f115\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jm7qk" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.344426 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tj72b\" (UniqueName: \"kubernetes.io/projected/581e69ae-c21a-4a9e-b1ea-9c38256d7b30-kube-api-access-tj72b\") pod \"downloads-7954f5f757-r4c4b\" (UID: \"581e69ae-c21a-4a9e-b1ea-9c38256d7b30\") " pod="openshift-console/downloads-7954f5f757-r4c4b" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.344458 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/6f8699c7-58f5-4a80-b5af-5403cb178676-audit-policies\") pod \"oauth-openshift-558db77b4-jf5jw\" (UID: \"6f8699c7-58f5-4a80-b5af-5403cb178676\") " pod="openshift-authentication/oauth-openshift-558db77b4-jf5jw" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.344617 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53344821-2f26-459a-9e42-003f3f1b5a87-config\") pod \"machine-api-operator-5694c8668f-kh6hg\" (UID: \"53344821-2f26-459a-9e42-003f3f1b5a87\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kh6hg" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.344790 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/836fad19-b7d1-434c-9fd8-faf3eb1d80d1-audit-policies\") pod \"apiserver-7bbb656c7d-jkghx\" (UID: \"836fad19-b7d1-434c-9fd8-faf3eb1d80d1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jkghx" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.345199 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/390f4e92-8639-45bb-b91c-a55773bfa293-config\") pod \"openshift-apiserver-operator-796bbdcf4f-m7jnm\" (UID: \"390f4e92-8639-45bb-b91c-a55773bfa293\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m7jnm" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.345274 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/836fad19-b7d1-434c-9fd8-faf3eb1d80d1-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-jkghx\" (UID: \"836fad19-b7d1-434c-9fd8-faf3eb1d80d1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jkghx" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.346092 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" 
(UniqueName: \"kubernetes.io/configmap/836fad19-b7d1-434c-9fd8-faf3eb1d80d1-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-jkghx\" (UID: \"836fad19-b7d1-434c-9fd8-faf3eb1d80d1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jkghx" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.346355 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/01ed973e-7ed7-41ec-bea9-69d8c86e19ed-available-featuregates\") pod \"openshift-config-operator-7777fb866f-9ml5g\" (UID: \"01ed973e-7ed7-41ec-bea9-69d8c86e19ed\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-9ml5g" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.346582 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2595ad7b-d8cc-46d2-b0db-60e7ce636e9f-service-ca-bundle\") pod \"authentication-operator-69f744f599-7vdck\" (UID: \"2595ad7b-d8cc-46d2-b0db-60e7ce636e9f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7vdck" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.347211 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/49f90ffa-baa8-4d9b-bd17-3a12f9c6f115-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-jm7qk\" (UID: \"49f90ffa-baa8-4d9b-bd17-3a12f9c6f115\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jm7qk" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.347279 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1c4e1d48-7f8d-44b6-97b3-3ceccb35385b-client-ca\") pod \"route-controller-manager-6576b87f9c-9k27r\" (UID: \"1c4e1d48-7f8d-44b6-97b3-3ceccb35385b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9k27r" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 
00:09:24.347356 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2595ad7b-d8cc-46d2-b0db-60e7ce636e9f-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-7vdck\" (UID: \"2595ad7b-d8cc-46d2-b0db-60e7ce636e9f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7vdck" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.348113 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49f90ffa-baa8-4d9b-bd17-3a12f9c6f115-config\") pod \"controller-manager-879f6c89f-jm7qk\" (UID: \"49f90ffa-baa8-4d9b-bd17-3a12f9c6f115\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jm7qk" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.349091 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49f90ffa-baa8-4d9b-bd17-3a12f9c6f115-serving-cert\") pod \"controller-manager-879f6c89f-jm7qk\" (UID: \"49f90ffa-baa8-4d9b-bd17-3a12f9c6f115\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jm7qk" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.349159 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/836fad19-b7d1-434c-9fd8-faf3eb1d80d1-etcd-client\") pod \"apiserver-7bbb656c7d-jkghx\" (UID: \"836fad19-b7d1-434c-9fd8-faf3eb1d80d1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jkghx" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.349266 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2595ad7b-d8cc-46d2-b0db-60e7ce636e9f-config\") pod \"authentication-operator-69f744f599-7vdck\" (UID: \"2595ad7b-d8cc-46d2-b0db-60e7ce636e9f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7vdck" Feb 24 
00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.349358 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ed973e-7ed7-41ec-bea9-69d8c86e19ed-serving-cert\") pod \"openshift-config-operator-7777fb866f-9ml5g\" (UID: \"01ed973e-7ed7-41ec-bea9-69d8c86e19ed\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-9ml5g" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.349471 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/53344821-2f26-459a-9e42-003f3f1b5a87-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-kh6hg\" (UID: \"53344821-2f26-459a-9e42-003f3f1b5a87\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kh6hg" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.350222 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/390f4e92-8639-45bb-b91c-a55773bfa293-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-m7jnm\" (UID: \"390f4e92-8639-45bb-b91c-a55773bfa293\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m7jnm" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.350598 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2595ad7b-d8cc-46d2-b0db-60e7ce636e9f-serving-cert\") pod \"authentication-operator-69f744f599-7vdck\" (UID: \"2595ad7b-d8cc-46d2-b0db-60e7ce636e9f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7vdck" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.351652 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c4e1d48-7f8d-44b6-97b3-3ceccb35385b-serving-cert\") pod \"route-controller-manager-6576b87f9c-9k27r\" (UID: 
\"1c4e1d48-7f8d-44b6-97b3-3ceccb35385b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9k27r" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.351799 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/836fad19-b7d1-434c-9fd8-faf3eb1d80d1-encryption-config\") pod \"apiserver-7bbb656c7d-jkghx\" (UID: \"836fad19-b7d1-434c-9fd8-faf3eb1d80d1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jkghx" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.352708 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/836fad19-b7d1-434c-9fd8-faf3eb1d80d1-serving-cert\") pod \"apiserver-7bbb656c7d-jkghx\" (UID: \"836fad19-b7d1-434c-9fd8-faf3eb1d80d1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jkghx" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.352712 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0c9bb52-a40f-4e1d-ba3d-fbb14eb74110-serving-cert\") pod \"etcd-operator-b45778765-pvqdd\" (UID: \"d0c9bb52-a40f-4e1d-ba3d-fbb14eb74110\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pvqdd" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.353932 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d0c9bb52-a40f-4e1d-ba3d-fbb14eb74110-etcd-client\") pod \"etcd-operator-b45778765-pvqdd\" (UID: \"d0c9bb52-a40f-4e1d-ba3d-fbb14eb74110\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pvqdd" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.363615 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.365259 4824 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/49f90ffa-baa8-4d9b-bd17-3a12f9c6f115-client-ca\") pod \"controller-manager-879f6c89f-jm7qk\" (UID: \"49f90ffa-baa8-4d9b-bd17-3a12f9c6f115\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jm7qk" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.383765 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.403875 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.423833 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"serviceca" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.443430 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"pruner-dockercfg-p7bcw" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.445190 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/250422bb-6e8f-4622-a456-ded5825e7c86-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-lwm9v\" (UID: \"250422bb-6e8f-4622-a456-ded5825e7c86\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lwm9v" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.445256 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85fxs\" (UniqueName: \"kubernetes.io/projected/13bff804-f118-473b-a547-433aed671b46-kube-api-access-85fxs\") pod \"control-plane-machine-set-operator-78cbb6b69f-q8hvw\" (UID: \"13bff804-f118-473b-a547-433aed671b46\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-q8hvw" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 
00:09:24.445284 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sk6p9\" (UniqueName: \"kubernetes.io/projected/250422bb-6e8f-4622-a456-ded5825e7c86-kube-api-access-sk6p9\") pod \"openshift-controller-manager-operator-756b6f6bc6-lwm9v\" (UID: \"250422bb-6e8f-4622-a456-ded5825e7c86\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lwm9v" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.445313 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8118fe3c-1479-4634-9b64-9350991d909d-metrics-tls\") pod \"dns-operator-744455d44c-h7djl\" (UID: \"8118fe3c-1479-4634-9b64-9350991d909d\") " pod="openshift-dns-operator/dns-operator-744455d44c-h7djl" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.445338 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55rl5\" (UniqueName: \"kubernetes.io/projected/8118fe3c-1479-4634-9b64-9350991d909d-kube-api-access-55rl5\") pod \"dns-operator-744455d44c-h7djl\" (UID: \"8118fe3c-1479-4634-9b64-9350991d909d\") " pod="openshift-dns-operator/dns-operator-744455d44c-h7djl" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.445356 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-jf5jw\" (UID: \"6f8699c7-58f5-4a80-b5af-5403cb178676\") " pod="openshift-authentication/oauth-openshift-558db77b4-jf5jw" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.445377 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgctm\" (UniqueName: \"kubernetes.io/projected/33028c7d-6bd8-46bb-a7d9-e1c46b0a26e4-kube-api-access-vgctm\") pod \"machine-config-operator-74547568cd-hl2rv\" 
(UID: \"33028c7d-6bd8-46bb-a7d9-e1c46b0a26e4\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hl2rv" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.445404 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-jf5jw\" (UID: \"6f8699c7-58f5-4a80-b5af-5403cb178676\") " pod="openshift-authentication/oauth-openshift-558db77b4-jf5jw" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.445444 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-jf5jw\" (UID: \"6f8699c7-58f5-4a80-b5af-5403cb178676\") " pod="openshift-authentication/oauth-openshift-558db77b4-jf5jw" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.445463 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-jf5jw\" (UID: \"6f8699c7-58f5-4a80-b5af-5403cb178676\") " pod="openshift-authentication/oauth-openshift-558db77b4-jf5jw" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.445490 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6f8699c7-58f5-4a80-b5af-5403cb178676-audit-policies\") pod \"oauth-openshift-558db77b4-jf5jw\" (UID: \"6f8699c7-58f5-4a80-b5af-5403cb178676\") " pod="openshift-authentication/oauth-openshift-558db77b4-jf5jw" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.445554 4824 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-c2tmt\" (UniqueName: \"kubernetes.io/projected/b14c3eec-796c-48b0-b4fe-67cb327f2de7-kube-api-access-c2tmt\") pod \"machine-config-controller-84d6567774-ksmk7\" (UID: \"b14c3eec-796c-48b0-b4fe-67cb327f2de7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ksmk7" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.445575 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/13bff804-f118-473b-a547-433aed671b46-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-q8hvw\" (UID: \"13bff804-f118-473b-a547-433aed671b46\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-q8hvw" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.445597 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b14c3eec-796c-48b0-b4fe-67cb327f2de7-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-ksmk7\" (UID: \"b14c3eec-796c-48b0-b4fe-67cb327f2de7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ksmk7" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.445623 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6f8699c7-58f5-4a80-b5af-5403cb178676-audit-dir\") pod \"oauth-openshift-558db77b4-jf5jw\" (UID: \"6f8699c7-58f5-4a80-b5af-5403cb178676\") " pod="openshift-authentication/oauth-openshift-558db77b4-jf5jw" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.445642 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-system-cliconfig\") pod 
\"oauth-openshift-558db77b4-jf5jw\" (UID: \"6f8699c7-58f5-4a80-b5af-5403cb178676\") " pod="openshift-authentication/oauth-openshift-558db77b4-jf5jw" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.445662 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-jf5jw\" (UID: \"6f8699c7-58f5-4a80-b5af-5403cb178676\") " pod="openshift-authentication/oauth-openshift-558db77b4-jf5jw" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.445684 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-jf5jw\" (UID: \"6f8699c7-58f5-4a80-b5af-5403cb178676\") " pod="openshift-authentication/oauth-openshift-558db77b4-jf5jw" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.445704 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/33028c7d-6bd8-46bb-a7d9-e1c46b0a26e4-auth-proxy-config\") pod \"machine-config-operator-74547568cd-hl2rv\" (UID: \"33028c7d-6bd8-46bb-a7d9-e1c46b0a26e4\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hl2rv" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.445731 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/33028c7d-6bd8-46bb-a7d9-e1c46b0a26e4-images\") pod \"machine-config-operator-74547568cd-hl2rv\" (UID: \"33028c7d-6bd8-46bb-a7d9-e1c46b0a26e4\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hl2rv" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 
00:09:24.445749 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/33028c7d-6bd8-46bb-a7d9-e1c46b0a26e4-proxy-tls\") pod \"machine-config-operator-74547568cd-hl2rv\" (UID: \"33028c7d-6bd8-46bb-a7d9-e1c46b0a26e4\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hl2rv" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.445770 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-jf5jw\" (UID: \"6f8699c7-58f5-4a80-b5af-5403cb178676\") " pod="openshift-authentication/oauth-openshift-558db77b4-jf5jw" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.445789 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-jf5jw\" (UID: \"6f8699c7-58f5-4a80-b5af-5403cb178676\") " pod="openshift-authentication/oauth-openshift-558db77b4-jf5jw" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.445809 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4dl5\" (UniqueName: \"kubernetes.io/projected/6f8699c7-58f5-4a80-b5af-5403cb178676-kube-api-access-z4dl5\") pod \"oauth-openshift-558db77b4-jf5jw\" (UID: \"6f8699c7-58f5-4a80-b5af-5403cb178676\") " pod="openshift-authentication/oauth-openshift-558db77b4-jf5jw" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.445828 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/250422bb-6e8f-4622-a456-ded5825e7c86-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-lwm9v\" 
(UID: \"250422bb-6e8f-4622-a456-ded5825e7c86\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lwm9v" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.445868 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-jf5jw\" (UID: \"6f8699c7-58f5-4a80-b5af-5403cb178676\") " pod="openshift-authentication/oauth-openshift-558db77b4-jf5jw" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.445886 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b14c3eec-796c-48b0-b4fe-67cb327f2de7-proxy-tls\") pod \"machine-config-controller-84d6567774-ksmk7\" (UID: \"b14c3eec-796c-48b0-b4fe-67cb327f2de7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ksmk7" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.445906 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-jf5jw\" (UID: \"6f8699c7-58f5-4a80-b5af-5403cb178676\") " pod="openshift-authentication/oauth-openshift-558db77b4-jf5jw" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.446199 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6f8699c7-58f5-4a80-b5af-5403cb178676-audit-dir\") pod \"oauth-openshift-558db77b4-jf5jw\" (UID: \"6f8699c7-58f5-4a80-b5af-5403cb178676\") " pod="openshift-authentication/oauth-openshift-558db77b4-jf5jw" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.447296 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6f8699c7-58f5-4a80-b5af-5403cb178676-audit-policies\") pod \"oauth-openshift-558db77b4-jf5jw\" (UID: \"6f8699c7-58f5-4a80-b5af-5403cb178676\") " pod="openshift-authentication/oauth-openshift-558db77b4-jf5jw" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.447555 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-jf5jw\" (UID: \"6f8699c7-58f5-4a80-b5af-5403cb178676\") " pod="openshift-authentication/oauth-openshift-558db77b4-jf5jw" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.448206 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/33028c7d-6bd8-46bb-a7d9-e1c46b0a26e4-images\") pod \"machine-config-operator-74547568cd-hl2rv\" (UID: \"33028c7d-6bd8-46bb-a7d9-e1c46b0a26e4\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hl2rv" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.448338 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b14c3eec-796c-48b0-b4fe-67cb327f2de7-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-ksmk7\" (UID: \"b14c3eec-796c-48b0-b4fe-67cb327f2de7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ksmk7" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.448789 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-jf5jw\" (UID: \"6f8699c7-58f5-4a80-b5af-5403cb178676\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-jf5jw" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.449611 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-jf5jw\" (UID: \"6f8699c7-58f5-4a80-b5af-5403cb178676\") " pod="openshift-authentication/oauth-openshift-558db77b4-jf5jw" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.450061 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/33028c7d-6bd8-46bb-a7d9-e1c46b0a26e4-auth-proxy-config\") pod \"machine-config-operator-74547568cd-hl2rv\" (UID: \"33028c7d-6bd8-46bb-a7d9-e1c46b0a26e4\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hl2rv" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.450143 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-jf5jw\" (UID: \"6f8699c7-58f5-4a80-b5af-5403cb178676\") " pod="openshift-authentication/oauth-openshift-558db77b4-jf5jw" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.450243 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8118fe3c-1479-4634-9b64-9350991d909d-metrics-tls\") pod \"dns-operator-744455d44c-h7djl\" (UID: \"8118fe3c-1479-4634-9b64-9350991d909d\") " pod="openshift-dns-operator/dns-operator-744455d44c-h7djl" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.450318 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/250422bb-6e8f-4622-a456-ded5825e7c86-serving-cert\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-lwm9v\" (UID: \"250422bb-6e8f-4622-a456-ded5825e7c86\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lwm9v" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.450508 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-jf5jw\" (UID: \"6f8699c7-58f5-4a80-b5af-5403cb178676\") " pod="openshift-authentication/oauth-openshift-558db77b4-jf5jw" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.450726 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/250422bb-6e8f-4622-a456-ded5825e7c86-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-lwm9v\" (UID: \"250422bb-6e8f-4622-a456-ded5825e7c86\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lwm9v" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.451761 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-jf5jw\" (UID: \"6f8699c7-58f5-4a80-b5af-5403cb178676\") " pod="openshift-authentication/oauth-openshift-558db77b4-jf5jw" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.452507 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-jf5jw\" (UID: \"6f8699c7-58f5-4a80-b5af-5403cb178676\") " pod="openshift-authentication/oauth-openshift-558db77b4-jf5jw" Feb 24 
00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.452740 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-jf5jw\" (UID: \"6f8699c7-58f5-4a80-b5af-5403cb178676\") " pod="openshift-authentication/oauth-openshift-558db77b4-jf5jw" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.453146 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-jf5jw\" (UID: \"6f8699c7-58f5-4a80-b5af-5403cb178676\") " pod="openshift-authentication/oauth-openshift-558db77b4-jf5jw" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.454146 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/33028c7d-6bd8-46bb-a7d9-e1c46b0a26e4-proxy-tls\") pod \"machine-config-operator-74547568cd-hl2rv\" (UID: \"33028c7d-6bd8-46bb-a7d9-e1c46b0a26e4\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hl2rv" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.454309 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-jf5jw\" (UID: \"6f8699c7-58f5-4a80-b5af-5403cb178676\") " pod="openshift-authentication/oauth-openshift-558db77b4-jf5jw" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.455881 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-user-template-login\") pod 
\"oauth-openshift-558db77b4-jf5jw\" (UID: \"6f8699c7-58f5-4a80-b5af-5403cb178676\") " pod="openshift-authentication/oauth-openshift-558db77b4-jf5jw" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.463505 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.483537 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.503925 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.523275 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.543789 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.563213 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.583435 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.604018 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.624600 4824 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.644202 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.664105 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.686148 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.704171 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.724070 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.743774 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.762407 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.782801 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.791647 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b14c3eec-796c-48b0-b4fe-67cb327f2de7-proxy-tls\") pod \"machine-config-controller-84d6567774-ksmk7\" (UID: \"b14c3eec-796c-48b0-b4fe-67cb327f2de7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ksmk7" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.803998 4824 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.824075 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.830688 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/13bff804-f118-473b-a547-433aed671b46-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-q8hvw\" (UID: \"13bff804-f118-473b-a547-433aed671b46\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-q8hvw" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.844021 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.887477 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.889975 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.903928 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.923748 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.943707 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 24 00:09:24 crc kubenswrapper[4824]: I0224 00:09:24.963180 4824 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 24 00:09:25 crc kubenswrapper[4824]: I0224 00:09:25.003346 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 24 00:09:25 crc kubenswrapper[4824]: I0224 00:09:25.024714 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 24 00:09:25 crc kubenswrapper[4824]: I0224 00:09:25.043559 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 24 00:09:25 crc kubenswrapper[4824]: I0224 00:09:25.064513 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 24 00:09:25 crc kubenswrapper[4824]: I0224 00:09:25.092721 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 24 00:09:25 crc kubenswrapper[4824]: I0224 00:09:25.103204 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 24 00:09:25 crc kubenswrapper[4824]: I0224 00:09:25.123433 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 24 00:09:25 crc kubenswrapper[4824]: I0224 00:09:25.142779 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 24 00:09:25 crc kubenswrapper[4824]: I0224 00:09:25.183446 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 24 00:09:25 crc kubenswrapper[4824]: I0224 00:09:25.183725 4824 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 24 00:09:25 crc kubenswrapper[4824]: I0224 00:09:25.203844 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 24 00:09:25 crc kubenswrapper[4824]: I0224 00:09:25.223363 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 24 00:09:25 crc kubenswrapper[4824]: I0224 00:09:25.243966 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 24 00:09:25 crc kubenswrapper[4824]: I0224 00:09:25.261942 4824 request.go:700] Waited for 1.01329131s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler-operator/secrets?fieldSelector=metadata.name%3Dopenshift-kube-scheduler-operator-dockercfg-qt55r&limit=500&resourceVersion=0 Feb 24 00:09:25 crc kubenswrapper[4824]: I0224 00:09:25.263827 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 24 00:09:25 crc kubenswrapper[4824]: I0224 00:09:25.284205 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 24 00:09:25 crc kubenswrapper[4824]: I0224 00:09:25.305091 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 24 00:09:25 crc kubenswrapper[4824]: I0224 00:09:25.323120 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 24 00:09:25 crc kubenswrapper[4824]: I0224 00:09:25.343262 4824 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 24 00:09:25 crc kubenswrapper[4824]: I0224 00:09:25.364103 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 24 00:09:25 crc kubenswrapper[4824]: I0224 00:09:25.384652 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 24 00:09:25 crc kubenswrapper[4824]: I0224 00:09:25.403873 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 24 00:09:25 crc kubenswrapper[4824]: I0224 00:09:25.424318 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 24 00:09:25 crc kubenswrapper[4824]: I0224 00:09:25.443646 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 24 00:09:25 crc kubenswrapper[4824]: I0224 00:09:25.464279 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 24 00:09:25 crc kubenswrapper[4824]: I0224 00:09:25.484621 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 24 00:09:25 crc kubenswrapper[4824]: I0224 00:09:25.504240 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 24 00:09:25 crc kubenswrapper[4824]: I0224 00:09:25.533507 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 24 00:09:25 crc kubenswrapper[4824]: I0224 00:09:25.543962 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" 
Feb 24 00:09:25 crc kubenswrapper[4824]: I0224 00:09:25.564055 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 24 00:09:25 crc kubenswrapper[4824]: I0224 00:09:25.583853 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 24 00:09:25 crc kubenswrapper[4824]: I0224 00:09:25.604577 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 24 00:09:25 crc kubenswrapper[4824]: I0224 00:09:25.623585 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 24 00:09:25 crc kubenswrapper[4824]: I0224 00:09:25.643940 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 24 00:09:25 crc kubenswrapper[4824]: I0224 00:09:25.663633 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 24 00:09:25 crc kubenswrapper[4824]: I0224 00:09:25.684484 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 24 00:09:25 crc kubenswrapper[4824]: I0224 00:09:25.705199 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 24 00:09:25 crc kubenswrapper[4824]: I0224 00:09:25.724127 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 24 00:09:25 crc kubenswrapper[4824]: I0224 00:09:25.743193 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 24 00:09:25 crc kubenswrapper[4824]: I0224 00:09:25.763270 4824 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 24 00:09:25 crc kubenswrapper[4824]: I0224 00:09:25.784724 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 24 00:09:25 crc kubenswrapper[4824]: I0224 00:09:25.808771 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 24 00:09:25 crc kubenswrapper[4824]: I0224 00:09:25.825205 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 24 00:09:25 crc kubenswrapper[4824]: I0224 00:09:25.844503 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 24 00:09:25 crc kubenswrapper[4824]: I0224 00:09:25.873262 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 24 00:09:25 crc kubenswrapper[4824]: I0224 00:09:25.883575 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 24 00:09:25 crc kubenswrapper[4824]: I0224 00:09:25.912380 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 24 00:09:25 crc kubenswrapper[4824]: I0224 00:09:25.922945 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 24 00:09:25 crc kubenswrapper[4824]: I0224 00:09:25.943119 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 24 00:09:25 crc kubenswrapper[4824]: I0224 00:09:25.963623 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 24 00:09:25 crc kubenswrapper[4824]: I0224 00:09:25.983450 4824 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.023479 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.044066 4824 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.064217 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.085869 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.103979 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.124104 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.158721 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.163365 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.183762 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.234726 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srljx\" (UniqueName: \"kubernetes.io/projected/390f4e92-8639-45bb-b91c-a55773bfa293-kube-api-access-srljx\") pod 
\"openshift-apiserver-operator-796bbdcf4f-m7jnm\" (UID: \"390f4e92-8639-45bb-b91c-a55773bfa293\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m7jnm" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.246344 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m7jnm" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.254282 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9qkd\" (UniqueName: \"kubernetes.io/projected/01ed973e-7ed7-41ec-bea9-69d8c86e19ed-kube-api-access-n9qkd\") pod \"openshift-config-operator-7777fb866f-9ml5g\" (UID: \"01ed973e-7ed7-41ec-bea9-69d8c86e19ed\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-9ml5g" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.262349 4824 request.go:700] Waited for 1.918316996s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-oauth-apiserver/serviceaccounts/oauth-apiserver-sa/token Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.281338 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctbsw\" (UniqueName: \"kubernetes.io/projected/53344821-2f26-459a-9e42-003f3f1b5a87-kube-api-access-ctbsw\") pod \"machine-api-operator-5694c8668f-kh6hg\" (UID: \"53344821-2f26-459a-9e42-003f3f1b5a87\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kh6hg" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.284807 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmr92\" (UniqueName: \"kubernetes.io/projected/836fad19-b7d1-434c-9fd8-faf3eb1d80d1-kube-api-access-hmr92\") pod \"apiserver-7bbb656c7d-jkghx\" (UID: \"836fad19-b7d1-434c-9fd8-faf3eb1d80d1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jkghx" Feb 24 00:09:26 crc 
kubenswrapper[4824]: I0224 00:09:26.305142 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zw9s\" (UniqueName: \"kubernetes.io/projected/2595ad7b-d8cc-46d2-b0db-60e7ce636e9f-kube-api-access-6zw9s\") pod \"authentication-operator-69f744f599-7vdck\" (UID: \"2595ad7b-d8cc-46d2-b0db-60e7ce636e9f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7vdck" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.330177 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw5js\" (UniqueName: \"kubernetes.io/projected/1c4e1d48-7f8d-44b6-97b3-3ceccb35385b-kube-api-access-nw5js\") pod \"route-controller-manager-6576b87f9c-9k27r\" (UID: \"1c4e1d48-7f8d-44b6-97b3-3ceccb35385b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9k27r" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.340545 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f4fs\" (UniqueName: \"kubernetes.io/projected/d0c9bb52-a40f-4e1d-ba3d-fbb14eb74110-kube-api-access-4f4fs\") pod \"etcd-operator-b45778765-pvqdd\" (UID: \"d0c9bb52-a40f-4e1d-ba3d-fbb14eb74110\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pvqdd" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.346200 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-7vdck" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.356751 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9k27r" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.368205 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tj72b\" (UniqueName: \"kubernetes.io/projected/581e69ae-c21a-4a9e-b1ea-9c38256d7b30-kube-api-access-tj72b\") pod \"downloads-7954f5f757-r4c4b\" (UID: \"581e69ae-c21a-4a9e-b1ea-9c38256d7b30\") " pod="openshift-console/downloads-7954f5f757-r4c4b" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.368536 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-9ml5g" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.370809 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-pvqdd" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.381920 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-r4c4b" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.386962 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqjj7\" (UniqueName: \"kubernetes.io/projected/49f90ffa-baa8-4d9b-bd17-3a12f9c6f115-kube-api-access-sqjj7\") pod \"controller-manager-879f6c89f-jm7qk\" (UID: \"49f90ffa-baa8-4d9b-bd17-3a12f9c6f115\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jm7qk" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.403183 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85fxs\" (UniqueName: \"kubernetes.io/projected/13bff804-f118-473b-a547-433aed671b46-kube-api-access-85fxs\") pod \"control-plane-machine-set-operator-78cbb6b69f-q8hvw\" (UID: \"13bff804-f118-473b-a547-433aed671b46\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-q8hvw" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.433580 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55rl5\" (UniqueName: \"kubernetes.io/projected/8118fe3c-1479-4634-9b64-9350991d909d-kube-api-access-55rl5\") pod \"dns-operator-744455d44c-h7djl\" (UID: \"8118fe3c-1479-4634-9b64-9350991d909d\") " pod="openshift-dns-operator/dns-operator-744455d44c-h7djl" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.442344 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sk6p9\" (UniqueName: \"kubernetes.io/projected/250422bb-6e8f-4622-a456-ded5825e7c86-kube-api-access-sk6p9\") pod \"openshift-controller-manager-operator-756b6f6bc6-lwm9v\" (UID: \"250422bb-6e8f-4622-a456-ded5825e7c86\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lwm9v" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.462855 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-c2tmt\" (UniqueName: \"kubernetes.io/projected/b14c3eec-796c-48b0-b4fe-67cb327f2de7-kube-api-access-c2tmt\") pod \"machine-config-controller-84d6567774-ksmk7\" (UID: \"b14c3eec-796c-48b0-b4fe-67cb327f2de7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ksmk7" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.482658 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4dl5\" (UniqueName: \"kubernetes.io/projected/6f8699c7-58f5-4a80-b5af-5403cb178676-kube-api-access-z4dl5\") pod \"oauth-openshift-558db77b4-jf5jw\" (UID: \"6f8699c7-58f5-4a80-b5af-5403cb178676\") " pod="openshift-authentication/oauth-openshift-558db77b4-jf5jw" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.503676 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-jm7qk" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.513811 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-jf5jw" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.514297 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgctm\" (UniqueName: \"kubernetes.io/projected/33028c7d-6bd8-46bb-a7d9-e1c46b0a26e4-kube-api-access-vgctm\") pod \"machine-config-operator-74547568cd-hl2rv\" (UID: \"33028c7d-6bd8-46bb-a7d9-e1c46b0a26e4\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hl2rv" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.525940 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hl2rv" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.530443 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-kh6hg" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.579987 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jkghx" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.595042 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m7jnm"] Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.605694 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f66ddecd-538b-48bd-a335-e7f99181daa0-node-pullsecrets\") pod \"apiserver-76f77b778f-xfl22\" (UID: \"f66ddecd-538b-48bd-a335-e7f99181daa0\") " pod="openshift-apiserver/apiserver-76f77b778f-xfl22" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.605744 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4349e271-7758-4dcf-9053-fbc984436a8b-auth-proxy-config\") pod \"machine-approver-56656f9798-vgr74\" (UID: \"4349e271-7758-4dcf-9053-fbc984436a8b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vgr74" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.605761 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/4349e271-7758-4dcf-9053-fbc984436a8b-machine-approver-tls\") pod \"machine-approver-56656f9798-vgr74\" (UID: \"4349e271-7758-4dcf-9053-fbc984436a8b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vgr74" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.605790 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dxf6\" 
(UniqueName: \"kubernetes.io/projected/5b0ff99f-1e04-4e23-895a-a02a303c8daa-kube-api-access-5dxf6\") pod \"router-default-5444994796-fp4wq\" (UID: \"5b0ff99f-1e04-4e23-895a-a02a303c8daa\") " pod="openshift-ingress/router-default-5444994796-fp4wq" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.605840 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d3de326-8359-4a7c-84da-57a071a929d7-config\") pod \"kube-apiserver-operator-766d6c64bb-b2k2f\" (UID: \"5d3de326-8359-4a7c-84da-57a071a929d7\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-b2k2f" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.605868 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ac86b042-947d-402f-a7c8-bb0a69d3f86e-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-cxlfh\" (UID: \"ac86b042-947d-402f-a7c8-bb0a69d3f86e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-cxlfh" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.605915 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a4a8a6d0-c052-4bca-9be8-dddd6d2ef017-srv-cert\") pod \"catalog-operator-68c6474976-rvv9j\" (UID: \"a4a8a6d0-c052-4bca-9be8-dddd6d2ef017\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rvv9j" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.605929 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a4a8a6d0-c052-4bca-9be8-dddd6d2ef017-profile-collector-cert\") pod \"catalog-operator-68c6474976-rvv9j\" (UID: \"a4a8a6d0-c052-4bca-9be8-dddd6d2ef017\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rvv9j" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.605945 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bb795a58-0029-416c-84fa-ae83cf338858-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-8dlbn\" (UID: \"bb795a58-0029-416c-84fa-ae83cf338858\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8dlbn" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.605976 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.606005 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9016587d-3cd5-46d7-bd50-586cd32933f7-registry-certificates\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.606029 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb795a58-0029-416c-84fa-ae83cf338858-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-8dlbn\" (UID: \"bb795a58-0029-416c-84fa-ae83cf338858\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8dlbn" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.606045 4824 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7dh8\" (UniqueName: \"kubernetes.io/projected/2f39cab4-77fc-4641-9e84-c01b0dedc300-kube-api-access-r7dh8\") pod \"console-operator-58897d9998-p5tqf\" (UID: \"2f39cab4-77fc-4641-9e84-c01b0dedc300\") " pod="openshift-console-operator/console-operator-58897d9998-p5tqf" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.606062 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f66ddecd-538b-48bd-a335-e7f99181daa0-encryption-config\") pod \"apiserver-76f77b778f-xfl22\" (UID: \"f66ddecd-538b-48bd-a335-e7f99181daa0\") " pod="openshift-apiserver/apiserver-76f77b778f-xfl22" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.606109 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9016587d-3cd5-46d7-bd50-586cd32933f7-ca-trust-extracted\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.606127 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9016587d-3cd5-46d7-bd50-586cd32933f7-installation-pull-secrets\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.606142 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f66ddecd-538b-48bd-a335-e7f99181daa0-serving-cert\") pod \"apiserver-76f77b778f-xfl22\" (UID: 
\"f66ddecd-538b-48bd-a335-e7f99181daa0\") " pod="openshift-apiserver/apiserver-76f77b778f-xfl22" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.606161 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdkzl\" (UniqueName: \"kubernetes.io/projected/9016587d-3cd5-46d7-bd50-586cd32933f7-kube-api-access-cdkzl\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.606191 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f09bc4be-bc94-4c63-93ec-4bc2fef07d1b-serviceca\") pod \"image-pruner-29531520-969xh\" (UID: \"f09bc4be-bc94-4c63-93ec-4bc2fef07d1b\") " pod="openshift-image-registry/image-pruner-29531520-969xh" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.606208 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/5b0ff99f-1e04-4e23-895a-a02a303c8daa-stats-auth\") pod \"router-default-5444994796-fp4wq\" (UID: \"5b0ff99f-1e04-4e23-895a-a02a303c8daa\") " pod="openshift-ingress/router-default-5444994796-fp4wq" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.606224 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b0ff99f-1e04-4e23-895a-a02a303c8daa-metrics-certs\") pod \"router-default-5444994796-fp4wq\" (UID: \"5b0ff99f-1e04-4e23-895a-a02a303c8daa\") " pod="openshift-ingress/router-default-5444994796-fp4wq" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.606252 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w68sp\" (UniqueName: 
\"kubernetes.io/projected/f66ddecd-538b-48bd-a335-e7f99181daa0-kube-api-access-w68sp\") pod \"apiserver-76f77b778f-xfl22\" (UID: \"f66ddecd-538b-48bd-a335-e7f99181daa0\") " pod="openshift-apiserver/apiserver-76f77b778f-xfl22" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.606266 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d3de326-8359-4a7c-84da-57a071a929d7-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-b2k2f\" (UID: \"5d3de326-8359-4a7c-84da-57a071a929d7\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-b2k2f" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.606288 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lp68j\" (UniqueName: \"kubernetes.io/projected/44376c4d-d433-41a2-bdc1-22a9792e7640-kube-api-access-lp68j\") pod \"cluster-samples-operator-665b6dd947-tjdhm\" (UID: \"44376c4d-d433-41a2-bdc1-22a9792e7640\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tjdhm" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.606316 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2f39cab4-77fc-4641-9e84-c01b0dedc300-trusted-ca\") pod \"console-operator-58897d9998-p5tqf\" (UID: \"2f39cab4-77fc-4641-9e84-c01b0dedc300\") " pod="openshift-console-operator/console-operator-58897d9998-p5tqf" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.606333 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5d3de326-8359-4a7c-84da-57a071a929d7-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-b2k2f\" (UID: \"5d3de326-8359-4a7c-84da-57a071a929d7\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-b2k2f" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.606360 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb795a58-0029-416c-84fa-ae83cf338858-config\") pod \"kube-controller-manager-operator-78b949d7b-8dlbn\" (UID: \"bb795a58-0029-416c-84fa-ae83cf338858\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8dlbn" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.606390 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9016587d-3cd5-46d7-bd50-586cd32933f7-registry-tls\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.606408 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9016587d-3cd5-46d7-bd50-586cd32933f7-trusted-ca\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.606445 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb7ql\" (UniqueName: \"kubernetes.io/projected/ac86b042-947d-402f-a7c8-bb0a69d3f86e-kube-api-access-fb7ql\") pod \"multus-admission-controller-857f4d67dd-cxlfh\" (UID: \"ac86b042-947d-402f-a7c8-bb0a69d3f86e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-cxlfh" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.606478 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f66ddecd-538b-48bd-a335-e7f99181daa0-trusted-ca-bundle\") pod \"apiserver-76f77b778f-xfl22\" (UID: \"f66ddecd-538b-48bd-a335-e7f99181daa0\") " pod="openshift-apiserver/apiserver-76f77b778f-xfl22" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.606632 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f66ddecd-538b-48bd-a335-e7f99181daa0-audit\") pod \"apiserver-76f77b778f-xfl22\" (UID: \"f66ddecd-538b-48bd-a335-e7f99181daa0\") " pod="openshift-apiserver/apiserver-76f77b778f-xfl22" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.606658 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f66ddecd-538b-48bd-a335-e7f99181daa0-etcd-serving-ca\") pod \"apiserver-76f77b778f-xfl22\" (UID: \"f66ddecd-538b-48bd-a335-e7f99181daa0\") " pod="openshift-apiserver/apiserver-76f77b778f-xfl22" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.606678 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f39cab4-77fc-4641-9e84-c01b0dedc300-config\") pod \"console-operator-58897d9998-p5tqf\" (UID: \"2f39cab4-77fc-4641-9e84-c01b0dedc300\") " pod="openshift-console-operator/console-operator-58897d9998-p5tqf" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.606698 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qc987\" (UniqueName: \"kubernetes.io/projected/f09bc4be-bc94-4c63-93ec-4bc2fef07d1b-kube-api-access-qc987\") pod \"image-pruner-29531520-969xh\" (UID: \"f09bc4be-bc94-4c63-93ec-4bc2fef07d1b\") " pod="openshift-image-registry/image-pruner-29531520-969xh" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 
00:09:26.606716 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b0ff99f-1e04-4e23-895a-a02a303c8daa-service-ca-bundle\") pod \"router-default-5444994796-fp4wq\" (UID: \"5b0ff99f-1e04-4e23-895a-a02a303c8daa\") " pod="openshift-ingress/router-default-5444994796-fp4wq" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.606734 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f66ddecd-538b-48bd-a335-e7f99181daa0-etcd-client\") pod \"apiserver-76f77b778f-xfl22\" (UID: \"f66ddecd-538b-48bd-a335-e7f99181daa0\") " pod="openshift-apiserver/apiserver-76f77b778f-xfl22" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.606751 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f66ddecd-538b-48bd-a335-e7f99181daa0-audit-dir\") pod \"apiserver-76f77b778f-xfl22\" (UID: \"f66ddecd-538b-48bd-a335-e7f99181daa0\") " pod="openshift-apiserver/apiserver-76f77b778f-xfl22" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.606772 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f66ddecd-538b-48bd-a335-e7f99181daa0-image-import-ca\") pod \"apiserver-76f77b778f-xfl22\" (UID: \"f66ddecd-538b-48bd-a335-e7f99181daa0\") " pod="openshift-apiserver/apiserver-76f77b778f-xfl22" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.606789 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/5b0ff99f-1e04-4e23-895a-a02a303c8daa-default-certificate\") pod \"router-default-5444994796-fp4wq\" (UID: \"5b0ff99f-1e04-4e23-895a-a02a303c8daa\") " 
pod="openshift-ingress/router-default-5444994796-fp4wq" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.606829 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9016587d-3cd5-46d7-bd50-586cd32933f7-bound-sa-token\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.606844 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5w65w\" (UniqueName: \"kubernetes.io/projected/a4a8a6d0-c052-4bca-9be8-dddd6d2ef017-kube-api-access-5w65w\") pod \"catalog-operator-68c6474976-rvv9j\" (UID: \"a4a8a6d0-c052-4bca-9be8-dddd6d2ef017\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rvv9j" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.606865 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/44376c4d-d433-41a2-bdc1-22a9792e7640-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-tjdhm\" (UID: \"44376c4d-d433-41a2-bdc1-22a9792e7640\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tjdhm" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.606886 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f66ddecd-538b-48bd-a335-e7f99181daa0-config\") pod \"apiserver-76f77b778f-xfl22\" (UID: \"f66ddecd-538b-48bd-a335-e7f99181daa0\") " pod="openshift-apiserver/apiserver-76f77b778f-xfl22" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.606922 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4349e271-7758-4dcf-9053-fbc984436a8b-config\") pod \"machine-approver-56656f9798-vgr74\" (UID: \"4349e271-7758-4dcf-9053-fbc984436a8b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vgr74" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.606939 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f39cab4-77fc-4641-9e84-c01b0dedc300-serving-cert\") pod \"console-operator-58897d9998-p5tqf\" (UID: \"2f39cab4-77fc-4641-9e84-c01b0dedc300\") " pod="openshift-console-operator/console-operator-58897d9998-p5tqf" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.606953 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjv8j\" (UniqueName: \"kubernetes.io/projected/4349e271-7758-4dcf-9053-fbc984436a8b-kube-api-access-mjv8j\") pod \"machine-approver-56656f9798-vgr74\" (UID: \"4349e271-7758-4dcf-9053-fbc984436a8b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vgr74" Feb 24 00:09:26 crc kubenswrapper[4824]: E0224 00:09:26.607967 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:27.107944942 +0000 UTC m=+231.097569411 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.608428 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ksmk7" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.615942 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-q8hvw" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.691984 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-h7djl" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.707948 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:26 crc kubenswrapper[4824]: E0224 00:09:26.708169 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:27.208124835 +0000 UTC m=+231.197749304 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.708229 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d3de326-8359-4a7c-84da-57a071a929d7-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-b2k2f\" (UID: \"5d3de326-8359-4a7c-84da-57a071a929d7\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-b2k2f" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.708286 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb795a58-0029-416c-84fa-ae83cf338858-config\") pod \"kube-controller-manager-operator-78b949d7b-8dlbn\" (UID: \"bb795a58-0029-416c-84fa-ae83cf338858\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8dlbn" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.708312 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e312a49f-dc7a-49fc-9baf-3105fec587ae-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-99tkw\" (UID: \"e312a49f-dc7a-49fc-9baf-3105fec587ae\") " pod="openshift-marketplace/marketplace-operator-79b997595-99tkw" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.708367 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lp68j\" (UniqueName: 
\"kubernetes.io/projected/44376c4d-d433-41a2-bdc1-22a9792e7640-kube-api-access-lp68j\") pod \"cluster-samples-operator-665b6dd947-tjdhm\" (UID: \"44376c4d-d433-41a2-bdc1-22a9792e7640\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tjdhm"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.708412 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/757dc1d0-9507-4470-8496-9162b8999465-node-bootstrap-token\") pod \"machine-config-server-vvlvv\" (UID: \"757dc1d0-9507-4470-8496-9162b8999465\") " pod="openshift-machine-config-operator/machine-config-server-vvlvv"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.708436 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f66ddecd-538b-48bd-a335-e7f99181daa0-trusted-ca-bundle\") pod \"apiserver-76f77b778f-xfl22\" (UID: \"f66ddecd-538b-48bd-a335-e7f99181daa0\") " pod="openshift-apiserver/apiserver-76f77b778f-xfl22"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.708456 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdcnv\" (UniqueName: \"kubernetes.io/projected/a7389587-c14d-45bd-b642-4ba3b5d7ac41-kube-api-access-xdcnv\") pod \"kube-storage-version-migrator-operator-b67b599dd-v9bck\" (UID: \"a7389587-c14d-45bd-b642-4ba3b5d7ac41\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v9bck"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.708510 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/54cdfa0a-fdb0-4509-9d56-01194a25ee63-metrics-tls\") pod \"ingress-operator-5b745b69d9-ghwq4\" (UID: \"54cdfa0a-fdb0-4509-9d56-01194a25ee63\") "
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ghwq4"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.708587 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f66ddecd-538b-48bd-a335-e7f99181daa0-etcd-serving-ca\") pod \"apiserver-76f77b778f-xfl22\" (UID: \"f66ddecd-538b-48bd-a335-e7f99181daa0\") " pod="openshift-apiserver/apiserver-76f77b778f-xfl22"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.708661 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b0ff99f-1e04-4e23-895a-a02a303c8daa-service-ca-bundle\") pod \"router-default-5444994796-fp4wq\" (UID: \"5b0ff99f-1e04-4e23-895a-a02a303c8daa\") " pod="openshift-ingress/router-default-5444994796-fp4wq"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.708702 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5f4f79cd-ada9-4ec7-b779-94d97bdadc97-console-config\") pod \"console-f9d7485db-zlnwh\" (UID: \"5f4f79cd-ada9-4ec7-b779-94d97bdadc97\") " pod="openshift-console/console-f9d7485db-zlnwh"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.708724 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/4cca2c2a-43ba-4b84-b10b-25053c6d7350-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-w6pjq\" (UID: \"4cca2c2a-43ba-4b84-b10b-25053c6d7350\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-w6pjq"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.708744 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName:
\"kubernetes.io/secret/5f4f79cd-ada9-4ec7-b779-94d97bdadc97-console-serving-cert\") pod \"console-f9d7485db-zlnwh\" (UID: \"5f4f79cd-ada9-4ec7-b779-94d97bdadc97\") " pod="openshift-console/console-f9d7485db-zlnwh"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.708783 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7389587-c14d-45bd-b642-4ba3b5d7ac41-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-v9bck\" (UID: \"a7389587-c14d-45bd-b642-4ba3b5d7ac41\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v9bck"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.708805 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f66ddecd-538b-48bd-a335-e7f99181daa0-image-import-ca\") pod \"apiserver-76f77b778f-xfl22\" (UID: \"f66ddecd-538b-48bd-a335-e7f99181daa0\") " pod="openshift-apiserver/apiserver-76f77b778f-xfl22"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.708827 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/5b0ff99f-1e04-4e23-895a-a02a303c8daa-default-certificate\") pod \"router-default-5444994796-fp4wq\" (UID: \"5b0ff99f-1e04-4e23-895a-a02a303c8daa\") " pod="openshift-ingress/router-default-5444994796-fp4wq"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.708865 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnxwh\" (UniqueName: \"kubernetes.io/projected/42d75b69-be96-43de-8687-444a81d8ebd5-kube-api-access-bnxwh\") pod \"csi-hostpathplugin-dq9gz\" (UID: \"42d75b69-be96-43de-8687-444a81d8ebd5\") " pod="hostpath-provisioner/csi-hostpathplugin-dq9gz"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224
00:09:26.708902 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f66ddecd-538b-48bd-a335-e7f99181daa0-config\") pod \"apiserver-76f77b778f-xfl22\" (UID: \"f66ddecd-538b-48bd-a335-e7f99181daa0\") " pod="openshift-apiserver/apiserver-76f77b778f-xfl22"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.708941 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a2e0a401-0fd7-499c-ac31-fc8cb0a64366-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-2jqvq\" (UID: \"a2e0a401-0fd7-499c-ac31-fc8cb0a64366\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2jqvq"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.708980 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f39cab4-77fc-4641-9e84-c01b0dedc300-serving-cert\") pod \"console-operator-58897d9998-p5tqf\" (UID: \"2f39cab4-77fc-4641-9e84-c01b0dedc300\") " pod="openshift-console-operator/console-operator-58897d9998-p5tqf"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.709026 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c15a2653-454b-42e4-85b5-87b99cc30198-srv-cert\") pod \"olm-operator-6b444d44fb-xs7nb\" (UID: \"c15a2653-454b-42e4-85b5-87b99cc30198\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xs7nb"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.709062 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/4349e271-7758-4dcf-9053-fbc984436a8b-machine-approver-tls\") pod \"machine-approver-56656f9798-vgr74\" (UID:
\"4349e271-7758-4dcf-9053-fbc984436a8b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vgr74"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.709107 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/fc6c8d25-266d-4f40-9b7c-e1697b87db51-tmpfs\") pod \"packageserver-d55dfcdfc-d95c9\" (UID: \"fc6c8d25-266d-4f40-9b7c-e1697b87db51\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d95c9"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.709140 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7389587-c14d-45bd-b642-4ba3b5d7ac41-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-v9bck\" (UID: \"a7389587-c14d-45bd-b642-4ba3b5d7ac41\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v9bck"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.709189 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d3de326-8359-4a7c-84da-57a071a929d7-config\") pod \"kube-apiserver-operator-766d6c64bb-b2k2f\" (UID: \"5d3de326-8359-4a7c-84da-57a071a929d7\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-b2k2f"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.709207 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/511cd2d5-0160-44f2-adf1-acbe5c8c28cf-metrics-tls\") pod \"dns-default-5n768\" (UID: \"511cd2d5-0160-44f2-adf1-acbe5c8c28cf\") " pod="openshift-dns/dns-default-5n768"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.709224 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume
\"kube-api-access-4thqj\" (UniqueName: \"kubernetes.io/projected/c15a2653-454b-42e4-85b5-87b99cc30198-kube-api-access-4thqj\") pod \"olm-operator-6b444d44fb-xs7nb\" (UID: \"c15a2653-454b-42e4-85b5-87b99cc30198\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xs7nb"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.709266 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bb795a58-0029-416c-84fa-ae83cf338858-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-8dlbn\" (UID: \"bb795a58-0029-416c-84fa-ae83cf338858\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8dlbn"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.709292 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fc6c8d25-266d-4f40-9b7c-e1697b87db51-webhook-cert\") pod \"packageserver-d55dfcdfc-d95c9\" (UID: \"fc6c8d25-266d-4f40-9b7c-e1697b87db51\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d95c9"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.709347 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a4a8a6d0-c052-4bca-9be8-dddd6d2ef017-profile-collector-cert\") pod \"catalog-operator-68c6474976-rvv9j\" (UID: \"a4a8a6d0-c052-4bca-9be8-dddd6d2ef017\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rvv9j"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.709413 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/511cd2d5-0160-44f2-adf1-acbe5c8c28cf-config-volume\") pod \"dns-default-5n768\" (UID: \"511cd2d5-0160-44f2-adf1-acbe5c8c28cf\") "
pod="openshift-dns/dns-default-5n768"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.709430 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hgp7\" (UniqueName: \"kubernetes.io/projected/cb0a4d10-0131-49ec-97f5-e77f1f222cdd-kube-api-access-2hgp7\") pod \"service-ca-9c57cc56f-8krrp\" (UID: \"cb0a4d10-0131-49ec-97f5-e77f1f222cdd\") " pod="openshift-service-ca/service-ca-9c57cc56f-8krrp"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.709510 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb795a58-0029-416c-84fa-ae83cf338858-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-8dlbn\" (UID: \"bb795a58-0029-416c-84fa-ae83cf338858\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8dlbn"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.709563 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f66ddecd-538b-48bd-a335-e7f99181daa0-encryption-config\") pod \"apiserver-76f77b778f-xfl22\" (UID: \"f66ddecd-538b-48bd-a335-e7f99181daa0\") " pod="openshift-apiserver/apiserver-76f77b778f-xfl22"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.709625 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9016587d-3cd5-46d7-bd50-586cd32933f7-installation-pull-secrets\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.709656 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvwdb\" (UniqueName:
\"kubernetes.io/projected/a2e0a401-0fd7-499c-ac31-fc8cb0a64366-kube-api-access-hvwdb\") pod \"package-server-manager-789f6589d5-2jqvq\" (UID: \"a2e0a401-0fd7-499c-ac31-fc8cb0a64366\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2jqvq"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.709679 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4cca2c2a-43ba-4b84-b10b-25053c6d7350-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-w6pjq\" (UID: \"4cca2c2a-43ba-4b84-b10b-25053c6d7350\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-w6pjq"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.709706 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f66ddecd-538b-48bd-a335-e7f99181daa0-trusted-ca-bundle\") pod \"apiserver-76f77b778f-xfl22\" (UID: \"f66ddecd-538b-48bd-a335-e7f99181daa0\") " pod="openshift-apiserver/apiserver-76f77b778f-xfl22"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.709746 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e312a49f-dc7a-49fc-9baf-3105fec587ae-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-99tkw\" (UID: \"e312a49f-dc7a-49fc-9baf-3105fec587ae\") " pod="openshift-marketplace/marketplace-operator-79b997595-99tkw"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.709833 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/42d75b69-be96-43de-8687-444a81d8ebd5-mountpoint-dir\") pod \"csi-hostpathplugin-dq9gz\" (UID: \"42d75b69-be96-43de-8687-444a81d8ebd5\") " pod="hostpath-provisioner/csi-hostpathplugin-dq9gz"
Feb 24
00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.709893 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f09bc4be-bc94-4c63-93ec-4bc2fef07d1b-serviceca\") pod \"image-pruner-29531520-969xh\" (UID: \"f09bc4be-bc94-4c63-93ec-4bc2fef07d1b\") " pod="openshift-image-registry/image-pruner-29531520-969xh"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.709918 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/5b0ff99f-1e04-4e23-895a-a02a303c8daa-stats-auth\") pod \"router-default-5444994796-fp4wq\" (UID: \"5b0ff99f-1e04-4e23-895a-a02a303c8daa\") " pod="openshift-ingress/router-default-5444994796-fp4wq"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.709969 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/cb0a4d10-0131-49ec-97f5-e77f1f222cdd-signing-key\") pod \"service-ca-9c57cc56f-8krrp\" (UID: \"cb0a4d10-0131-49ec-97f5-e77f1f222cdd\") " pod="openshift-service-ca/service-ca-9c57cc56f-8krrp"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.709988 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w68sp\" (UniqueName: \"kubernetes.io/projected/f66ddecd-538b-48bd-a335-e7f99181daa0-kube-api-access-w68sp\") pod \"apiserver-76f77b778f-xfl22\" (UID: \"f66ddecd-538b-48bd-a335-e7f99181daa0\") " pod="openshift-apiserver/apiserver-76f77b778f-xfl22"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.710049 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4cca2c2a-43ba-4b84-b10b-25053c6d7350-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-w6pjq\" (UID: \"4cca2c2a-43ba-4b84-b10b-25053c6d7350\") "
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-w6pjq"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.710076 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5f4f79cd-ada9-4ec7-b779-94d97bdadc97-service-ca\") pod \"console-f9d7485db-zlnwh\" (UID: \"5f4f79cd-ada9-4ec7-b779-94d97bdadc97\") " pod="openshift-console/console-f9d7485db-zlnwh"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.710164 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2f39cab4-77fc-4641-9e84-c01b0dedc300-trusted-ca\") pod \"console-operator-58897d9998-p5tqf\" (UID: \"2f39cab4-77fc-4641-9e84-c01b0dedc300\") " pod="openshift-console-operator/console-operator-58897d9998-p5tqf"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.710207 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5d3de326-8359-4a7c-84da-57a071a929d7-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-b2k2f\" (UID: \"5d3de326-8359-4a7c-84da-57a071a929d7\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-b2k2f"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.710238 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/239fc97c-cb5a-4fa1-965e-7b64c90268ce-config-volume\") pod \"collect-profiles-29531520-xxvzq\" (UID: \"239fc97c-cb5a-4fa1-965e-7b64c90268ce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531520-xxvzq"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.710277 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName:
\"kubernetes.io/projected/9016587d-3cd5-46d7-bd50-586cd32933f7-registry-tls\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.710298 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9016587d-3cd5-46d7-bd50-586cd32933f7-trusted-ca\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.710317 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb7ql\" (UniqueName: \"kubernetes.io/projected/ac86b042-947d-402f-a7c8-bb0a69d3f86e-kube-api-access-fb7ql\") pod \"multus-admission-controller-857f4d67dd-cxlfh\" (UID: \"ac86b042-947d-402f-a7c8-bb0a69d3f86e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-cxlfh"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.710371 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f66ddecd-538b-48bd-a335-e7f99181daa0-audit\") pod \"apiserver-76f77b778f-xfl22\" (UID: \"f66ddecd-538b-48bd-a335-e7f99181daa0\") " pod="openshift-apiserver/apiserver-76f77b778f-xfl22"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.710392 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/cb0a4d10-0131-49ec-97f5-e77f1f222cdd-signing-cabundle\") pod \"service-ca-9c57cc56f-8krrp\" (UID: \"cb0a4d10-0131-49ec-97f5-e77f1f222cdd\") " pod="openshift-service-ca/service-ca-9c57cc56f-8krrp"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.710430 4824 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/54cdfa0a-fdb0-4509-9d56-01194a25ee63-trusted-ca\") pod \"ingress-operator-5b745b69d9-ghwq4\" (UID: \"54cdfa0a-fdb0-4509-9d56-01194a25ee63\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ghwq4"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.710449 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v9d5\" (UniqueName: \"kubernetes.io/projected/4cca2c2a-43ba-4b84-b10b-25053c6d7350-kube-api-access-7v9d5\") pod \"cluster-image-registry-operator-dc59b4c8b-w6pjq\" (UID: \"4cca2c2a-43ba-4b84-b10b-25053c6d7350\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-w6pjq"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.710468 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f39cab4-77fc-4641-9e84-c01b0dedc300-config\") pod \"console-operator-58897d9998-p5tqf\" (UID: \"2f39cab4-77fc-4641-9e84-c01b0dedc300\") " pod="openshift-console-operator/console-operator-58897d9998-p5tqf"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.710484 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qc987\" (UniqueName: \"kubernetes.io/projected/f09bc4be-bc94-4c63-93ec-4bc2fef07d1b-kube-api-access-qc987\") pod \"image-pruner-29531520-969xh\" (UID: \"f09bc4be-bc94-4c63-93ec-4bc2fef07d1b\") " pod="openshift-image-registry/image-pruner-29531520-969xh"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.710535 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c15a2653-454b-42e4-85b5-87b99cc30198-profile-collector-cert\") pod \"olm-operator-6b444d44fb-xs7nb\" (UID: \"c15a2653-454b-42e4-85b5-87b99cc30198\")
" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xs7nb"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.710558 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5f4f79cd-ada9-4ec7-b779-94d97bdadc97-oauth-serving-cert\") pod \"console-f9d7485db-zlnwh\" (UID: \"5f4f79cd-ada9-4ec7-b779-94d97bdadc97\") " pod="openshift-console/console-f9d7485db-zlnwh"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.710617 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f66ddecd-538b-48bd-a335-e7f99181daa0-etcd-client\") pod \"apiserver-76f77b778f-xfl22\" (UID: \"f66ddecd-538b-48bd-a335-e7f99181daa0\") " pod="openshift-apiserver/apiserver-76f77b778f-xfl22"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.710637 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f66ddecd-538b-48bd-a335-e7f99181daa0-audit-dir\") pod \"apiserver-76f77b778f-xfl22\" (UID: \"f66ddecd-538b-48bd-a335-e7f99181daa0\") " pod="openshift-apiserver/apiserver-76f77b778f-xfl22"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.710700 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c3e3160a-60c7-424c-b5a3-53841213467d-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-c94hb\" (UID: \"c3e3160a-60c7-424c-b5a3-53841213467d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c94hb"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.710729 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9016587d-3cd5-46d7-bd50-586cd32933f7-bound-sa-token\") pod
\"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.710780 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5w65w\" (UniqueName: \"kubernetes.io/projected/a4a8a6d0-c052-4bca-9be8-dddd6d2ef017-kube-api-access-5w65w\") pod \"catalog-operator-68c6474976-rvv9j\" (UID: \"a4a8a6d0-c052-4bca-9be8-dddd6d2ef017\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rvv9j"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.710803 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/54cdfa0a-fdb0-4509-9d56-01194a25ee63-bound-sa-token\") pod \"ingress-operator-5b745b69d9-ghwq4\" (UID: \"54cdfa0a-fdb0-4509-9d56-01194a25ee63\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ghwq4"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.710856 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/44376c4d-d433-41a2-bdc1-22a9792e7640-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-tjdhm\" (UID: \"44376c4d-d433-41a2-bdc1-22a9792e7640\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tjdhm"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.710915 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb795a58-0029-416c-84fa-ae83cf338858-config\") pod \"kube-controller-manager-operator-78b949d7b-8dlbn\" (UID: \"bb795a58-0029-416c-84fa-ae83cf338858\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8dlbn"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.710920 4824
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4349e271-7758-4dcf-9053-fbc984436a8b-config\") pod \"machine-approver-56656f9798-vgr74\" (UID: \"4349e271-7758-4dcf-9053-fbc984436a8b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vgr74"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.710972 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f76k\" (UniqueName: \"kubernetes.io/projected/e312a49f-dc7a-49fc-9baf-3105fec587ae-kube-api-access-6f76k\") pod \"marketplace-operator-79b997595-99tkw\" (UID: \"e312a49f-dc7a-49fc-9baf-3105fec587ae\") " pod="openshift-marketplace/marketplace-operator-79b997595-99tkw"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.711018 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjv8j\" (UniqueName: \"kubernetes.io/projected/4349e271-7758-4dcf-9053-fbc984436a8b-kube-api-access-mjv8j\") pod \"machine-approver-56656f9798-vgr74\" (UID: \"4349e271-7758-4dcf-9053-fbc984436a8b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vgr74"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.711039 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6zzs\" (UniqueName: \"kubernetes.io/projected/67b600b4-d056-4b5f-b75e-0502de432461-kube-api-access-h6zzs\") pod \"service-ca-operator-777779d784-6h296\" (UID: \"67b600b4-d056-4b5f-b75e-0502de432461\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6h296"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.711063 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/239fc97c-cb5a-4fa1-965e-7b64c90268ce-secret-volume\") pod \"collect-profiles-29531520-xxvzq\"
(UID: \"239fc97c-cb5a-4fa1-965e-7b64c90268ce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531520-xxvzq"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.711093 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f66ddecd-538b-48bd-a335-e7f99181daa0-node-pullsecrets\") pod \"apiserver-76f77b778f-xfl22\" (UID: \"f66ddecd-538b-48bd-a335-e7f99181daa0\") " pod="openshift-apiserver/apiserver-76f77b778f-xfl22"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.711112 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4349e271-7758-4dcf-9053-fbc984436a8b-auth-proxy-config\") pod \"machine-approver-56656f9798-vgr74\" (UID: \"4349e271-7758-4dcf-9053-fbc984436a8b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vgr74"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.711136 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dxf6\" (UniqueName: \"kubernetes.io/projected/5b0ff99f-1e04-4e23-895a-a02a303c8daa-kube-api-access-5dxf6\") pod \"router-default-5444994796-fp4wq\" (UID: \"5b0ff99f-1e04-4e23-895a-a02a303c8daa\") " pod="openshift-ingress/router-default-5444994796-fp4wq"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.711160 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3e3160a-60c7-424c-b5a3-53841213467d-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-c94hb\" (UID: \"c3e3160a-60c7-424c-b5a3-53841213467d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c94hb"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.711178 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for
volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f4f79cd-ada9-4ec7-b779-94d97bdadc97-trusted-ca-bundle\") pod \"console-f9d7485db-zlnwh\" (UID: \"5f4f79cd-ada9-4ec7-b779-94d97bdadc97\") " pod="openshift-console/console-f9d7485db-zlnwh"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.711219 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/42d75b69-be96-43de-8687-444a81d8ebd5-registration-dir\") pod \"csi-hostpathplugin-dq9gz\" (UID: \"42d75b69-be96-43de-8687-444a81d8ebd5\") " pod="hostpath-provisioner/csi-hostpathplugin-dq9gz"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.711249 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fc6c8d25-266d-4f40-9b7c-e1697b87db51-apiservice-cert\") pod \"packageserver-d55dfcdfc-d95c9\" (UID: \"fc6c8d25-266d-4f40-9b7c-e1697b87db51\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d95c9"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.711291 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67b600b4-d056-4b5f-b75e-0502de432461-serving-cert\") pod \"service-ca-operator-777779d784-6h296\" (UID: \"67b600b4-d056-4b5f-b75e-0502de432461\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6h296"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.711313 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ac86b042-947d-402f-a7c8-bb0a69d3f86e-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-cxlfh\" (UID: \"ac86b042-947d-402f-a7c8-bb0a69d3f86e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-cxlfh"
Feb 24 00:09:26 crc
kubenswrapper[4824]: I0224 00:09:26.711330 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzbm6\" (UniqueName: \"kubernetes.io/projected/782d3fe9-7b5b-4d44-a3e6-0efea9d617ea-kube-api-access-bzbm6\") pod \"ingress-canary-4tvd9\" (UID: \"782d3fe9-7b5b-4d44-a3e6-0efea9d617ea\") " pod="openshift-ingress-canary/ingress-canary-4tvd9" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.711349 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3e3160a-60c7-424c-b5a3-53841213467d-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-c94hb\" (UID: \"c3e3160a-60c7-424c-b5a3-53841213467d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c94hb" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.711367 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67b600b4-d056-4b5f-b75e-0502de432461-config\") pod \"service-ca-operator-777779d784-6h296\" (UID: \"67b600b4-d056-4b5f-b75e-0502de432461\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6h296" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.711408 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a4a8a6d0-c052-4bca-9be8-dddd6d2ef017-srv-cert\") pod \"catalog-operator-68c6474976-rvv9j\" (UID: \"a4a8a6d0-c052-4bca-9be8-dddd6d2ef017\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rvv9j" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.711430 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/782d3fe9-7b5b-4d44-a3e6-0efea9d617ea-cert\") pod \"ingress-canary-4tvd9\" (UID: 
\"782d3fe9-7b5b-4d44-a3e6-0efea9d617ea\") " pod="openshift-ingress-canary/ingress-canary-4tvd9" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.711449 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5f4f79cd-ada9-4ec7-b779-94d97bdadc97-console-oauth-config\") pod \"console-f9d7485db-zlnwh\" (UID: \"5f4f79cd-ada9-4ec7-b779-94d97bdadc97\") " pod="openshift-console/console-f9d7485db-zlnwh" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.711469 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ddjs\" (UniqueName: \"kubernetes.io/projected/5f4f79cd-ada9-4ec7-b779-94d97bdadc97-kube-api-access-2ddjs\") pod \"console-f9d7485db-zlnwh\" (UID: \"5f4f79cd-ada9-4ec7-b779-94d97bdadc97\") " pod="openshift-console/console-f9d7485db-zlnwh" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.711497 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.711540 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/42d75b69-be96-43de-8687-444a81d8ebd5-csi-data-dir\") pod \"csi-hostpathplugin-dq9gz\" (UID: \"42d75b69-be96-43de-8687-444a81d8ebd5\") " pod="hostpath-provisioner/csi-hostpathplugin-dq9gz" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.711562 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb8wj\" (UniqueName: 
\"kubernetes.io/projected/511cd2d5-0160-44f2-adf1-acbe5c8c28cf-kube-api-access-qb8wj\") pod \"dns-default-5n768\" (UID: \"511cd2d5-0160-44f2-adf1-acbe5c8c28cf\") " pod="openshift-dns/dns-default-5n768" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.711581 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/757dc1d0-9507-4470-8496-9162b8999465-certs\") pod \"machine-config-server-vvlvv\" (UID: \"757dc1d0-9507-4470-8496-9162b8999465\") " pod="openshift-machine-config-operator/machine-config-server-vvlvv" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.711722 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9016587d-3cd5-46d7-bd50-586cd32933f7-registry-certificates\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.711744 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/42d75b69-be96-43de-8687-444a81d8ebd5-plugins-dir\") pod \"csi-hostpathplugin-dq9gz\" (UID: \"42d75b69-be96-43de-8687-444a81d8ebd5\") " pod="hostpath-provisioner/csi-hostpathplugin-dq9gz" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.711772 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfr7v\" (UniqueName: \"kubernetes.io/projected/ac257861-33c1-4e92-9d58-bb7351f6316e-kube-api-access-gfr7v\") pod \"migrator-59844c95c7-5ltfz\" (UID: \"ac257861-33c1-4e92-9d58-bb7351f6316e\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5ltfz" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.711796 4824 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-r7dh8\" (UniqueName: \"kubernetes.io/projected/2f39cab4-77fc-4641-9e84-c01b0dedc300-kube-api-access-r7dh8\") pod \"console-operator-58897d9998-p5tqf\" (UID: \"2f39cab4-77fc-4641-9e84-c01b0dedc300\") " pod="openshift-console-operator/console-operator-58897d9998-p5tqf" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.711817 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9016587d-3cd5-46d7-bd50-586cd32933f7-ca-trust-extracted\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.711836 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/42d75b69-be96-43de-8687-444a81d8ebd5-socket-dir\") pod \"csi-hostpathplugin-dq9gz\" (UID: \"42d75b69-be96-43de-8687-444a81d8ebd5\") " pod="hostpath-provisioner/csi-hostpathplugin-dq9gz" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.711871 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f66ddecd-538b-48bd-a335-e7f99181daa0-serving-cert\") pod \"apiserver-76f77b778f-xfl22\" (UID: \"f66ddecd-538b-48bd-a335-e7f99181daa0\") " pod="openshift-apiserver/apiserver-76f77b778f-xfl22" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.711890 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdkzl\" (UniqueName: \"kubernetes.io/projected/9016587d-3cd5-46d7-bd50-586cd32933f7-kube-api-access-cdkzl\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:26 crc 
kubenswrapper[4824]: I0224 00:09:26.711915 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sr4rm\" (UniqueName: \"kubernetes.io/projected/54cdfa0a-fdb0-4509-9d56-01194a25ee63-kube-api-access-sr4rm\") pod \"ingress-operator-5b745b69d9-ghwq4\" (UID: \"54cdfa0a-fdb0-4509-9d56-01194a25ee63\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ghwq4" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.711933 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvmgf\" (UniqueName: \"kubernetes.io/projected/757dc1d0-9507-4470-8496-9162b8999465-kube-api-access-xvmgf\") pod \"machine-config-server-vvlvv\" (UID: \"757dc1d0-9507-4470-8496-9162b8999465\") " pod="openshift-machine-config-operator/machine-config-server-vvlvv" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.711961 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9glzv\" (UniqueName: \"kubernetes.io/projected/239fc97c-cb5a-4fa1-965e-7b64c90268ce-kube-api-access-9glzv\") pod \"collect-profiles-29531520-xxvzq\" (UID: \"239fc97c-cb5a-4fa1-965e-7b64c90268ce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531520-xxvzq" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.711978 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsdfv\" (UniqueName: \"kubernetes.io/projected/fc6c8d25-266d-4f40-9b7c-e1697b87db51-kube-api-access-gsdfv\") pod \"packageserver-d55dfcdfc-d95c9\" (UID: \"fc6c8d25-266d-4f40-9b7c-e1697b87db51\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d95c9" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.711998 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/5b0ff99f-1e04-4e23-895a-a02a303c8daa-metrics-certs\") pod \"router-default-5444994796-fp4wq\" (UID: \"5b0ff99f-1e04-4e23-895a-a02a303c8daa\") " pod="openshift-ingress/router-default-5444994796-fp4wq" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.714607 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f66ddecd-538b-48bd-a335-e7f99181daa0-etcd-serving-ca\") pod \"apiserver-76f77b778f-xfl22\" (UID: \"f66ddecd-538b-48bd-a335-e7f99181daa0\") " pod="openshift-apiserver/apiserver-76f77b778f-xfl22" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.714933 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b0ff99f-1e04-4e23-895a-a02a303c8daa-service-ca-bundle\") pod \"router-default-5444994796-fp4wq\" (UID: \"5b0ff99f-1e04-4e23-895a-a02a303c8daa\") " pod="openshift-ingress/router-default-5444994796-fp4wq" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.715542 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f66ddecd-538b-48bd-a335-e7f99181daa0-node-pullsecrets\") pod \"apiserver-76f77b778f-xfl22\" (UID: \"f66ddecd-538b-48bd-a335-e7f99181daa0\") " pod="openshift-apiserver/apiserver-76f77b778f-xfl22" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.715555 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f09bc4be-bc94-4c63-93ec-4bc2fef07d1b-serviceca\") pod \"image-pruner-29531520-969xh\" (UID: \"f09bc4be-bc94-4c63-93ec-4bc2fef07d1b\") " pod="openshift-image-registry/image-pruner-29531520-969xh" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.716177 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5d3de326-8359-4a7c-84da-57a071a929d7-config\") pod \"kube-apiserver-operator-766d6c64bb-b2k2f\" (UID: \"5d3de326-8359-4a7c-84da-57a071a929d7\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-b2k2f" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.716790 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4349e271-7758-4dcf-9053-fbc984436a8b-auth-proxy-config\") pod \"machine-approver-56656f9798-vgr74\" (UID: \"4349e271-7758-4dcf-9053-fbc984436a8b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vgr74" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.717006 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f66ddecd-538b-48bd-a335-e7f99181daa0-image-import-ca\") pod \"apiserver-76f77b778f-xfl22\" (UID: \"f66ddecd-538b-48bd-a335-e7f99181daa0\") " pod="openshift-apiserver/apiserver-76f77b778f-xfl22" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.717238 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lwm9v" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.719600 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f66ddecd-538b-48bd-a335-e7f99181daa0-audit\") pod \"apiserver-76f77b778f-xfl22\" (UID: \"f66ddecd-538b-48bd-a335-e7f99181daa0\") " pod="openshift-apiserver/apiserver-76f77b778f-xfl22" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.721346 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d3de326-8359-4a7c-84da-57a071a929d7-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-b2k2f\" (UID: \"5d3de326-8359-4a7c-84da-57a071a929d7\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-b2k2f" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.722086 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/5b0ff99f-1e04-4e23-895a-a02a303c8daa-stats-auth\") pod \"router-default-5444994796-fp4wq\" (UID: \"5b0ff99f-1e04-4e23-895a-a02a303c8daa\") " pod="openshift-ingress/router-default-5444994796-fp4wq" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.722369 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b0ff99f-1e04-4e23-895a-a02a303c8daa-metrics-certs\") pod \"router-default-5444994796-fp4wq\" (UID: \"5b0ff99f-1e04-4e23-895a-a02a303c8daa\") " pod="openshift-ingress/router-default-5444994796-fp4wq" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.722934 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a4a8a6d0-c052-4bca-9be8-dddd6d2ef017-profile-collector-cert\") pod \"catalog-operator-68c6474976-rvv9j\" (UID: 
\"a4a8a6d0-c052-4bca-9be8-dddd6d2ef017\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rvv9j" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.723357 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/5b0ff99f-1e04-4e23-895a-a02a303c8daa-default-certificate\") pod \"router-default-5444994796-fp4wq\" (UID: \"5b0ff99f-1e04-4e23-895a-a02a303c8daa\") " pod="openshift-ingress/router-default-5444994796-fp4wq" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.724663 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2f39cab4-77fc-4641-9e84-c01b0dedc300-trusted-ca\") pod \"console-operator-58897d9998-p5tqf\" (UID: \"2f39cab4-77fc-4641-9e84-c01b0dedc300\") " pod="openshift-console-operator/console-operator-58897d9998-p5tqf" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.725689 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9016587d-3cd5-46d7-bd50-586cd32933f7-ca-trust-extracted\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.727113 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9016587d-3cd5-46d7-bd50-586cd32933f7-registry-tls\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.727589 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f66ddecd-538b-48bd-a335-e7f99181daa0-config\") pod \"apiserver-76f77b778f-xfl22\" 
(UID: \"f66ddecd-538b-48bd-a335-e7f99181daa0\") " pod="openshift-apiserver/apiserver-76f77b778f-xfl22" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.728332 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f66ddecd-538b-48bd-a335-e7f99181daa0-audit-dir\") pod \"apiserver-76f77b778f-xfl22\" (UID: \"f66ddecd-538b-48bd-a335-e7f99181daa0\") " pod="openshift-apiserver/apiserver-76f77b778f-xfl22" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.730164 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4349e271-7758-4dcf-9053-fbc984436a8b-config\") pod \"machine-approver-56656f9798-vgr74\" (UID: \"4349e271-7758-4dcf-9053-fbc984436a8b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vgr74" Feb 24 00:09:26 crc kubenswrapper[4824]: E0224 00:09:26.730841 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:27.230812189 +0000 UTC m=+231.220436658 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.731448 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb795a58-0029-416c-84fa-ae83cf338858-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-8dlbn\" (UID: \"bb795a58-0029-416c-84fa-ae83cf338858\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8dlbn" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.731637 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f39cab4-77fc-4641-9e84-c01b0dedc300-config\") pod \"console-operator-58897d9998-p5tqf\" (UID: \"2f39cab4-77fc-4641-9e84-c01b0dedc300\") " pod="openshift-console-operator/console-operator-58897d9998-p5tqf" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.731954 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9016587d-3cd5-46d7-bd50-586cd32933f7-registry-certificates\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.732545 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9016587d-3cd5-46d7-bd50-586cd32933f7-trusted-ca\") pod \"image-registry-697d97f7c8-ccm27\" (UID: 
\"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.734171 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/44376c4d-d433-41a2-bdc1-22a9792e7640-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-tjdhm\" (UID: \"44376c4d-d433-41a2-bdc1-22a9792e7640\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tjdhm" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.737205 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f66ddecd-538b-48bd-a335-e7f99181daa0-etcd-client\") pod \"apiserver-76f77b778f-xfl22\" (UID: \"f66ddecd-538b-48bd-a335-e7f99181daa0\") " pod="openshift-apiserver/apiserver-76f77b778f-xfl22" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.742886 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f39cab4-77fc-4641-9e84-c01b0dedc300-serving-cert\") pod \"console-operator-58897d9998-p5tqf\" (UID: \"2f39cab4-77fc-4641-9e84-c01b0dedc300\") " pod="openshift-console-operator/console-operator-58897d9998-p5tqf" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.743883 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9016587d-3cd5-46d7-bd50-586cd32933f7-installation-pull-secrets\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.745282 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a4a8a6d0-c052-4bca-9be8-dddd6d2ef017-srv-cert\") pod 
\"catalog-operator-68c6474976-rvv9j\" (UID: \"a4a8a6d0-c052-4bca-9be8-dddd6d2ef017\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rvv9j" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.746365 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/4349e271-7758-4dcf-9053-fbc984436a8b-machine-approver-tls\") pod \"machine-approver-56656f9798-vgr74\" (UID: \"4349e271-7758-4dcf-9053-fbc984436a8b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vgr74" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.746744 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f66ddecd-538b-48bd-a335-e7f99181daa0-encryption-config\") pod \"apiserver-76f77b778f-xfl22\" (UID: \"f66ddecd-538b-48bd-a335-e7f99181daa0\") " pod="openshift-apiserver/apiserver-76f77b778f-xfl22" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.747139 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ac86b042-947d-402f-a7c8-bb0a69d3f86e-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-cxlfh\" (UID: \"ac86b042-947d-402f-a7c8-bb0a69d3f86e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-cxlfh" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.748495 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f66ddecd-538b-48bd-a335-e7f99181daa0-serving-cert\") pod \"apiserver-76f77b778f-xfl22\" (UID: \"f66ddecd-538b-48bd-a335-e7f99181daa0\") " pod="openshift-apiserver/apiserver-76f77b778f-xfl22" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.763553 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/bb795a58-0029-416c-84fa-ae83cf338858-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-8dlbn\" (UID: \"bb795a58-0029-416c-84fa-ae83cf338858\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8dlbn" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.792561 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjv8j\" (UniqueName: \"kubernetes.io/projected/4349e271-7758-4dcf-9053-fbc984436a8b-kube-api-access-mjv8j\") pod \"machine-approver-56656f9798-vgr74\" (UID: \"4349e271-7758-4dcf-9053-fbc984436a8b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vgr74" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.812957 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.813142 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/239fc97c-cb5a-4fa1-965e-7b64c90268ce-secret-volume\") pod \"collect-profiles-29531520-xxvzq\" (UID: \"239fc97c-cb5a-4fa1-965e-7b64c90268ce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531520-xxvzq" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.813175 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3e3160a-60c7-424c-b5a3-53841213467d-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-c94hb\" (UID: \"c3e3160a-60c7-424c-b5a3-53841213467d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c94hb" Feb 24 00:09:26 crc 
kubenswrapper[4824]: I0224 00:09:26.813197 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f4f79cd-ada9-4ec7-b779-94d97bdadc97-trusted-ca-bundle\") pod \"console-f9d7485db-zlnwh\" (UID: \"5f4f79cd-ada9-4ec7-b779-94d97bdadc97\") " pod="openshift-console/console-f9d7485db-zlnwh" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.813217 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/42d75b69-be96-43de-8687-444a81d8ebd5-registration-dir\") pod \"csi-hostpathplugin-dq9gz\" (UID: \"42d75b69-be96-43de-8687-444a81d8ebd5\") " pod="hostpath-provisioner/csi-hostpathplugin-dq9gz" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.813242 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fc6c8d25-266d-4f40-9b7c-e1697b87db51-apiservice-cert\") pod \"packageserver-d55dfcdfc-d95c9\" (UID: \"fc6c8d25-266d-4f40-9b7c-e1697b87db51\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d95c9" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.813262 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67b600b4-d056-4b5f-b75e-0502de432461-serving-cert\") pod \"service-ca-operator-777779d784-6h296\" (UID: \"67b600b4-d056-4b5f-b75e-0502de432461\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6h296" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.813285 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzbm6\" (UniqueName: \"kubernetes.io/projected/782d3fe9-7b5b-4d44-a3e6-0efea9d617ea-kube-api-access-bzbm6\") pod \"ingress-canary-4tvd9\" (UID: \"782d3fe9-7b5b-4d44-a3e6-0efea9d617ea\") " 
pod="openshift-ingress-canary/ingress-canary-4tvd9"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.813301 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3e3160a-60c7-424c-b5a3-53841213467d-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-c94hb\" (UID: \"c3e3160a-60c7-424c-b5a3-53841213467d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c94hb"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.813318 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67b600b4-d056-4b5f-b75e-0502de432461-config\") pod \"service-ca-operator-777779d784-6h296\" (UID: \"67b600b4-d056-4b5f-b75e-0502de432461\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6h296"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.813339 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/782d3fe9-7b5b-4d44-a3e6-0efea9d617ea-cert\") pod \"ingress-canary-4tvd9\" (UID: \"782d3fe9-7b5b-4d44-a3e6-0efea9d617ea\") " pod="openshift-ingress-canary/ingress-canary-4tvd9"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.813363 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/42d75b69-be96-43de-8687-444a81d8ebd5-csi-data-dir\") pod \"csi-hostpathplugin-dq9gz\" (UID: \"42d75b69-be96-43de-8687-444a81d8ebd5\") " pod="hostpath-provisioner/csi-hostpathplugin-dq9gz"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.813380 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qb8wj\" (UniqueName: \"kubernetes.io/projected/511cd2d5-0160-44f2-adf1-acbe5c8c28cf-kube-api-access-qb8wj\") pod \"dns-default-5n768\" (UID: \"511cd2d5-0160-44f2-adf1-acbe5c8c28cf\") " pod="openshift-dns/dns-default-5n768"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.813401 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/757dc1d0-9507-4470-8496-9162b8999465-certs\") pod \"machine-config-server-vvlvv\" (UID: \"757dc1d0-9507-4470-8496-9162b8999465\") " pod="openshift-machine-config-operator/machine-config-server-vvlvv"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.813417 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5f4f79cd-ada9-4ec7-b779-94d97bdadc97-console-oauth-config\") pod \"console-f9d7485db-zlnwh\" (UID: \"5f4f79cd-ada9-4ec7-b779-94d97bdadc97\") " pod="openshift-console/console-f9d7485db-zlnwh"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.813436 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ddjs\" (UniqueName: \"kubernetes.io/projected/5f4f79cd-ada9-4ec7-b779-94d97bdadc97-kube-api-access-2ddjs\") pod \"console-f9d7485db-zlnwh\" (UID: \"5f4f79cd-ada9-4ec7-b779-94d97bdadc97\") " pod="openshift-console/console-f9d7485db-zlnwh"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.813454 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/42d75b69-be96-43de-8687-444a81d8ebd5-plugins-dir\") pod \"csi-hostpathplugin-dq9gz\" (UID: \"42d75b69-be96-43de-8687-444a81d8ebd5\") " pod="hostpath-provisioner/csi-hostpathplugin-dq9gz"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.813476 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfr7v\" (UniqueName: \"kubernetes.io/projected/ac257861-33c1-4e92-9d58-bb7351f6316e-kube-api-access-gfr7v\") pod \"migrator-59844c95c7-5ltfz\" (UID: \"ac257861-33c1-4e92-9d58-bb7351f6316e\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5ltfz"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.813500 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/42d75b69-be96-43de-8687-444a81d8ebd5-socket-dir\") pod \"csi-hostpathplugin-dq9gz\" (UID: \"42d75b69-be96-43de-8687-444a81d8ebd5\") " pod="hostpath-provisioner/csi-hostpathplugin-dq9gz"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.813543 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sr4rm\" (UniqueName: \"kubernetes.io/projected/54cdfa0a-fdb0-4509-9d56-01194a25ee63-kube-api-access-sr4rm\") pod \"ingress-operator-5b745b69d9-ghwq4\" (UID: \"54cdfa0a-fdb0-4509-9d56-01194a25ee63\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ghwq4"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.813562 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvmgf\" (UniqueName: \"kubernetes.io/projected/757dc1d0-9507-4470-8496-9162b8999465-kube-api-access-xvmgf\") pod \"machine-config-server-vvlvv\" (UID: \"757dc1d0-9507-4470-8496-9162b8999465\") " pod="openshift-machine-config-operator/machine-config-server-vvlvv"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.813587 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9glzv\" (UniqueName: \"kubernetes.io/projected/239fc97c-cb5a-4fa1-965e-7b64c90268ce-kube-api-access-9glzv\") pod \"collect-profiles-29531520-xxvzq\" (UID: \"239fc97c-cb5a-4fa1-965e-7b64c90268ce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531520-xxvzq"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.813605 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsdfv\" (UniqueName: \"kubernetes.io/projected/fc6c8d25-266d-4f40-9b7c-e1697b87db51-kube-api-access-gsdfv\") pod \"packageserver-d55dfcdfc-d95c9\" (UID: \"fc6c8d25-266d-4f40-9b7c-e1697b87db51\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d95c9"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.813634 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e312a49f-dc7a-49fc-9baf-3105fec587ae-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-99tkw\" (UID: \"e312a49f-dc7a-49fc-9baf-3105fec587ae\") " pod="openshift-marketplace/marketplace-operator-79b997595-99tkw"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.813652 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/757dc1d0-9507-4470-8496-9162b8999465-node-bootstrap-token\") pod \"machine-config-server-vvlvv\" (UID: \"757dc1d0-9507-4470-8496-9162b8999465\") " pod="openshift-machine-config-operator/machine-config-server-vvlvv"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.813680 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdcnv\" (UniqueName: \"kubernetes.io/projected/a7389587-c14d-45bd-b642-4ba3b5d7ac41-kube-api-access-xdcnv\") pod \"kube-storage-version-migrator-operator-b67b599dd-v9bck\" (UID: \"a7389587-c14d-45bd-b642-4ba3b5d7ac41\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v9bck"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.813698 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/54cdfa0a-fdb0-4509-9d56-01194a25ee63-metrics-tls\") pod \"ingress-operator-5b745b69d9-ghwq4\" (UID: \"54cdfa0a-fdb0-4509-9d56-01194a25ee63\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ghwq4"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.813718 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5f4f79cd-ada9-4ec7-b779-94d97bdadc97-console-config\") pod \"console-f9d7485db-zlnwh\" (UID: \"5f4f79cd-ada9-4ec7-b779-94d97bdadc97\") " pod="openshift-console/console-f9d7485db-zlnwh"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.813737 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/4cca2c2a-43ba-4b84-b10b-25053c6d7350-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-w6pjq\" (UID: \"4cca2c2a-43ba-4b84-b10b-25053c6d7350\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-w6pjq"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.813754 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5f4f79cd-ada9-4ec7-b779-94d97bdadc97-console-serving-cert\") pod \"console-f9d7485db-zlnwh\" (UID: \"5f4f79cd-ada9-4ec7-b779-94d97bdadc97\") " pod="openshift-console/console-f9d7485db-zlnwh"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.813772 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnxwh\" (UniqueName: \"kubernetes.io/projected/42d75b69-be96-43de-8687-444a81d8ebd5-kube-api-access-bnxwh\") pod \"csi-hostpathplugin-dq9gz\" (UID: \"42d75b69-be96-43de-8687-444a81d8ebd5\") " pod="hostpath-provisioner/csi-hostpathplugin-dq9gz"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.813789 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7389587-c14d-45bd-b642-4ba3b5d7ac41-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-v9bck\" (UID: \"a7389587-c14d-45bd-b642-4ba3b5d7ac41\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v9bck"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.813812 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a2e0a401-0fd7-499c-ac31-fc8cb0a64366-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-2jqvq\" (UID: \"a2e0a401-0fd7-499c-ac31-fc8cb0a64366\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2jqvq"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.813832 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c15a2653-454b-42e4-85b5-87b99cc30198-srv-cert\") pod \"olm-operator-6b444d44fb-xs7nb\" (UID: \"c15a2653-454b-42e4-85b5-87b99cc30198\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xs7nb"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.813851 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/fc6c8d25-266d-4f40-9b7c-e1697b87db51-tmpfs\") pod \"packageserver-d55dfcdfc-d95c9\" (UID: \"fc6c8d25-266d-4f40-9b7c-e1697b87db51\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d95c9"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.813871 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7389587-c14d-45bd-b642-4ba3b5d7ac41-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-v9bck\" (UID: \"a7389587-c14d-45bd-b642-4ba3b5d7ac41\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v9bck"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.813890 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/511cd2d5-0160-44f2-adf1-acbe5c8c28cf-metrics-tls\") pod \"dns-default-5n768\" (UID: \"511cd2d5-0160-44f2-adf1-acbe5c8c28cf\") " pod="openshift-dns/dns-default-5n768"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.813908 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4thqj\" (UniqueName: \"kubernetes.io/projected/c15a2653-454b-42e4-85b5-87b99cc30198-kube-api-access-4thqj\") pod \"olm-operator-6b444d44fb-xs7nb\" (UID: \"c15a2653-454b-42e4-85b5-87b99cc30198\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xs7nb"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.813933 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fc6c8d25-266d-4f40-9b7c-e1697b87db51-webhook-cert\") pod \"packageserver-d55dfcdfc-d95c9\" (UID: \"fc6c8d25-266d-4f40-9b7c-e1697b87db51\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d95c9"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.813962 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/511cd2d5-0160-44f2-adf1-acbe5c8c28cf-config-volume\") pod \"dns-default-5n768\" (UID: \"511cd2d5-0160-44f2-adf1-acbe5c8c28cf\") " pod="openshift-dns/dns-default-5n768"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.813991 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hgp7\" (UniqueName: \"kubernetes.io/projected/cb0a4d10-0131-49ec-97f5-e77f1f222cdd-kube-api-access-2hgp7\") pod \"service-ca-9c57cc56f-8krrp\" (UID: \"cb0a4d10-0131-49ec-97f5-e77f1f222cdd\") " pod="openshift-service-ca/service-ca-9c57cc56f-8krrp"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.814020 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvwdb\" (UniqueName: \"kubernetes.io/projected/a2e0a401-0fd7-499c-ac31-fc8cb0a64366-kube-api-access-hvwdb\") pod \"package-server-manager-789f6589d5-2jqvq\" (UID: \"a2e0a401-0fd7-499c-ac31-fc8cb0a64366\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2jqvq"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.814039 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4cca2c2a-43ba-4b84-b10b-25053c6d7350-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-w6pjq\" (UID: \"4cca2c2a-43ba-4b84-b10b-25053c6d7350\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-w6pjq"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.814060 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e312a49f-dc7a-49fc-9baf-3105fec587ae-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-99tkw\" (UID: \"e312a49f-dc7a-49fc-9baf-3105fec587ae\") " pod="openshift-marketplace/marketplace-operator-79b997595-99tkw"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.814078 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/42d75b69-be96-43de-8687-444a81d8ebd5-mountpoint-dir\") pod \"csi-hostpathplugin-dq9gz\" (UID: \"42d75b69-be96-43de-8687-444a81d8ebd5\") " pod="hostpath-provisioner/csi-hostpathplugin-dq9gz"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.814095 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/cb0a4d10-0131-49ec-97f5-e77f1f222cdd-signing-key\") pod \"service-ca-9c57cc56f-8krrp\" (UID: \"cb0a4d10-0131-49ec-97f5-e77f1f222cdd\") " pod="openshift-service-ca/service-ca-9c57cc56f-8krrp"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.814126 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4cca2c2a-43ba-4b84-b10b-25053c6d7350-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-w6pjq\" (UID: \"4cca2c2a-43ba-4b84-b10b-25053c6d7350\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-w6pjq"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.814143 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5f4f79cd-ada9-4ec7-b779-94d97bdadc97-service-ca\") pod \"console-f9d7485db-zlnwh\" (UID: \"5f4f79cd-ada9-4ec7-b779-94d97bdadc97\") " pod="openshift-console/console-f9d7485db-zlnwh"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.814159 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/239fc97c-cb5a-4fa1-965e-7b64c90268ce-config-volume\") pod \"collect-profiles-29531520-xxvzq\" (UID: \"239fc97c-cb5a-4fa1-965e-7b64c90268ce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531520-xxvzq"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.814206 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/cb0a4d10-0131-49ec-97f5-e77f1f222cdd-signing-cabundle\") pod \"service-ca-9c57cc56f-8krrp\" (UID: \"cb0a4d10-0131-49ec-97f5-e77f1f222cdd\") " pod="openshift-service-ca/service-ca-9c57cc56f-8krrp"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.814223 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/54cdfa0a-fdb0-4509-9d56-01194a25ee63-trusted-ca\") pod \"ingress-operator-5b745b69d9-ghwq4\" (UID: \"54cdfa0a-fdb0-4509-9d56-01194a25ee63\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ghwq4"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.814246 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c15a2653-454b-42e4-85b5-87b99cc30198-profile-collector-cert\") pod \"olm-operator-6b444d44fb-xs7nb\" (UID: \"c15a2653-454b-42e4-85b5-87b99cc30198\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xs7nb"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.814262 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7v9d5\" (UniqueName: \"kubernetes.io/projected/4cca2c2a-43ba-4b84-b10b-25053c6d7350-kube-api-access-7v9d5\") pod \"cluster-image-registry-operator-dc59b4c8b-w6pjq\" (UID: \"4cca2c2a-43ba-4b84-b10b-25053c6d7350\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-w6pjq"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.814280 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5f4f79cd-ada9-4ec7-b779-94d97bdadc97-oauth-serving-cert\") pod \"console-f9d7485db-zlnwh\" (UID: \"5f4f79cd-ada9-4ec7-b779-94d97bdadc97\") " pod="openshift-console/console-f9d7485db-zlnwh"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.814297 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c3e3160a-60c7-424c-b5a3-53841213467d-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-c94hb\" (UID: \"c3e3160a-60c7-424c-b5a3-53841213467d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c94hb"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.814323 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/54cdfa0a-fdb0-4509-9d56-01194a25ee63-bound-sa-token\") pod \"ingress-operator-5b745b69d9-ghwq4\" (UID: \"54cdfa0a-fdb0-4509-9d56-01194a25ee63\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ghwq4"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.814344 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6f76k\" (UniqueName: \"kubernetes.io/projected/e312a49f-dc7a-49fc-9baf-3105fec587ae-kube-api-access-6f76k\") pod \"marketplace-operator-79b997595-99tkw\" (UID: \"e312a49f-dc7a-49fc-9baf-3105fec587ae\") " pod="openshift-marketplace/marketplace-operator-79b997595-99tkw"
Feb 24 00:09:26 crc kubenswrapper[4824]: E0224 00:09:26.814370 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:27.314345046 +0000 UTC m=+231.303969505 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.814431 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6zzs\" (UniqueName: \"kubernetes.io/projected/67b600b4-d056-4b5f-b75e-0502de432461-kube-api-access-h6zzs\") pod \"service-ca-operator-777779d784-6h296\" (UID: \"67b600b4-d056-4b5f-b75e-0502de432461\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6h296"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.816812 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4cca2c2a-43ba-4b84-b10b-25053c6d7350-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-w6pjq\" (UID: \"4cca2c2a-43ba-4b84-b10b-25053c6d7350\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-w6pjq"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.817573 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/511cd2d5-0160-44f2-adf1-acbe5c8c28cf-config-volume\") pod \"dns-default-5n768\" (UID: \"511cd2d5-0160-44f2-adf1-acbe5c8c28cf\") " pod="openshift-dns/dns-default-5n768"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.817622 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/42d75b69-be96-43de-8687-444a81d8ebd5-csi-data-dir\") pod \"csi-hostpathplugin-dq9gz\" (UID: \"42d75b69-be96-43de-8687-444a81d8ebd5\") " pod="hostpath-provisioner/csi-hostpathplugin-dq9gz"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.818966 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/cb0a4d10-0131-49ec-97f5-e77f1f222cdd-signing-cabundle\") pod \"service-ca-9c57cc56f-8krrp\" (UID: \"cb0a4d10-0131-49ec-97f5-e77f1f222cdd\") " pod="openshift-service-ca/service-ca-9c57cc56f-8krrp"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.819044 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/42d75b69-be96-43de-8687-444a81d8ebd5-mountpoint-dir\") pod \"csi-hostpathplugin-dq9gz\" (UID: \"42d75b69-be96-43de-8687-444a81d8ebd5\") " pod="hostpath-provisioner/csi-hostpathplugin-dq9gz"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.823086 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/54cdfa0a-fdb0-4509-9d56-01194a25ee63-trusted-ca\") pod \"ingress-operator-5b745b69d9-ghwq4\" (UID: \"54cdfa0a-fdb0-4509-9d56-01194a25ee63\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ghwq4"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.823437 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/42d75b69-be96-43de-8687-444a81d8ebd5-plugins-dir\") pod \"csi-hostpathplugin-dq9gz\" (UID: \"42d75b69-be96-43de-8687-444a81d8ebd5\") " pod="hostpath-provisioner/csi-hostpathplugin-dq9gz"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.823483 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dxf6\" (UniqueName: \"kubernetes.io/projected/5b0ff99f-1e04-4e23-895a-a02a303c8daa-kube-api-access-5dxf6\") pod \"router-default-5444994796-fp4wq\" (UID: \"5b0ff99f-1e04-4e23-895a-a02a303c8daa\") " pod="openshift-ingress/router-default-5444994796-fp4wq"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.824680 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67b600b4-d056-4b5f-b75e-0502de432461-config\") pod \"service-ca-operator-777779d784-6h296\" (UID: \"67b600b4-d056-4b5f-b75e-0502de432461\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6h296"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.825204 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3e3160a-60c7-424c-b5a3-53841213467d-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-c94hb\" (UID: \"c3e3160a-60c7-424c-b5a3-53841213467d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c94hb"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.825634 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/42d75b69-be96-43de-8687-444a81d8ebd5-socket-dir\") pod \"csi-hostpathplugin-dq9gz\" (UID: \"42d75b69-be96-43de-8687-444a81d8ebd5\") " pod="hostpath-provisioner/csi-hostpathplugin-dq9gz"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.826387 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/757dc1d0-9507-4470-8496-9162b8999465-node-bootstrap-token\") pod \"machine-config-server-vvlvv\" (UID: \"757dc1d0-9507-4470-8496-9162b8999465\") " pod="openshift-machine-config-operator/machine-config-server-vvlvv"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.826447 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/42d75b69-be96-43de-8687-444a81d8ebd5-registration-dir\") pod \"csi-hostpathplugin-dq9gz\" (UID: \"42d75b69-be96-43de-8687-444a81d8ebd5\") " pod="hostpath-provisioner/csi-hostpathplugin-dq9gz"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.826480 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/239fc97c-cb5a-4fa1-965e-7b64c90268ce-secret-volume\") pod \"collect-profiles-29531520-xxvzq\" (UID: \"239fc97c-cb5a-4fa1-965e-7b64c90268ce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531520-xxvzq"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.827808 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e312a49f-dc7a-49fc-9baf-3105fec587ae-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-99tkw\" (UID: \"e312a49f-dc7a-49fc-9baf-3105fec587ae\") " pod="openshift-marketplace/marketplace-operator-79b997595-99tkw"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.828083 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f4f79cd-ada9-4ec7-b779-94d97bdadc97-trusted-ca-bundle\") pod \"console-f9d7485db-zlnwh\" (UID: \"5f4f79cd-ada9-4ec7-b779-94d97bdadc97\") " pod="openshift-console/console-f9d7485db-zlnwh"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.829261 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5f4f79cd-ada9-4ec7-b779-94d97bdadc97-service-ca\") pod \"console-f9d7485db-zlnwh\" (UID: \"5f4f79cd-ada9-4ec7-b779-94d97bdadc97\") " pod="openshift-console/console-f9d7485db-zlnwh"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.829656 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3e3160a-60c7-424c-b5a3-53841213467d-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-c94hb\" (UID: \"c3e3160a-60c7-424c-b5a3-53841213467d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c94hb"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.830404 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e312a49f-dc7a-49fc-9baf-3105fec587ae-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-99tkw\" (UID: \"e312a49f-dc7a-49fc-9baf-3105fec587ae\") " pod="openshift-marketplace/marketplace-operator-79b997595-99tkw"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.831022 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c15a2653-454b-42e4-85b5-87b99cc30198-profile-collector-cert\") pod \"olm-operator-6b444d44fb-xs7nb\" (UID: \"c15a2653-454b-42e4-85b5-87b99cc30198\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xs7nb"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.831570 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7389587-c14d-45bd-b642-4ba3b5d7ac41-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-v9bck\" (UID: \"a7389587-c14d-45bd-b642-4ba3b5d7ac41\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v9bck"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.832053 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/757dc1d0-9507-4470-8496-9162b8999465-certs\") pod \"machine-config-server-vvlvv\" (UID: \"757dc1d0-9507-4470-8496-9162b8999465\") " pod="openshift-machine-config-operator/machine-config-server-vvlvv"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.833529 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7389587-c14d-45bd-b642-4ba3b5d7ac41-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-v9bck\" (UID: \"a7389587-c14d-45bd-b642-4ba3b5d7ac41\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v9bck"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.833722 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/782d3fe9-7b5b-4d44-a3e6-0efea9d617ea-cert\") pod \"ingress-canary-4tvd9\" (UID: \"782d3fe9-7b5b-4d44-a3e6-0efea9d617ea\") " pod="openshift-ingress-canary/ingress-canary-4tvd9"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.834214 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/cb0a4d10-0131-49ec-97f5-e77f1f222cdd-signing-key\") pod \"service-ca-9c57cc56f-8krrp\" (UID: \"cb0a4d10-0131-49ec-97f5-e77f1f222cdd\") " pod="openshift-service-ca/service-ca-9c57cc56f-8krrp"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.834700 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67b600b4-d056-4b5f-b75e-0502de432461-serving-cert\") pod \"service-ca-operator-777779d784-6h296\" (UID: \"67b600b4-d056-4b5f-b75e-0502de432461\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6h296"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.834890 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fc6c8d25-266d-4f40-9b7c-e1697b87db51-apiservice-cert\") pod \"packageserver-d55dfcdfc-d95c9\" (UID: \"fc6c8d25-266d-4f40-9b7c-e1697b87db51\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d95c9"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.836558 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w68sp\" (UniqueName: \"kubernetes.io/projected/f66ddecd-538b-48bd-a335-e7f99181daa0-kube-api-access-w68sp\") pod \"apiserver-76f77b778f-xfl22\" (UID: \"f66ddecd-538b-48bd-a335-e7f99181daa0\") " pod="openshift-apiserver/apiserver-76f77b778f-xfl22"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.837683 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a2e0a401-0fd7-499c-ac31-fc8cb0a64366-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-2jqvq\" (UID: \"a2e0a401-0fd7-499c-ac31-fc8cb0a64366\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2jqvq"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.838000 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/239fc97c-cb5a-4fa1-965e-7b64c90268ce-config-volume\") pod \"collect-profiles-29531520-xxvzq\" (UID: \"239fc97c-cb5a-4fa1-965e-7b64c90268ce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531520-xxvzq"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.838019 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/4cca2c2a-43ba-4b84-b10b-25053c6d7350-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-w6pjq\" (UID: \"4cca2c2a-43ba-4b84-b10b-25053c6d7350\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-w6pjq"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.838318 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/fc6c8d25-266d-4f40-9b7c-e1697b87db51-tmpfs\") pod \"packageserver-d55dfcdfc-d95c9\" (UID: \"fc6c8d25-266d-4f40-9b7c-e1697b87db51\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d95c9"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.840001 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fc6c8d25-266d-4f40-9b7c-e1697b87db51-webhook-cert\") pod \"packageserver-d55dfcdfc-d95c9\" (UID: \"fc6c8d25-266d-4f40-9b7c-e1697b87db51\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d95c9"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.841391 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5f4f79cd-ada9-4ec7-b779-94d97bdadc97-console-oauth-config\") pod \"console-f9d7485db-zlnwh\" (UID: \"5f4f79cd-ada9-4ec7-b779-94d97bdadc97\") " pod="openshift-console/console-f9d7485db-zlnwh"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.841572 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5f4f79cd-ada9-4ec7-b779-94d97bdadc97-oauth-serving-cert\") pod \"console-f9d7485db-zlnwh\" (UID: \"5f4f79cd-ada9-4ec7-b779-94d97bdadc97\") " pod="openshift-console/console-f9d7485db-zlnwh"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.842244 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qc987\" (UniqueName: \"kubernetes.io/projected/f09bc4be-bc94-4c63-93ec-4bc2fef07d1b-kube-api-access-qc987\") pod \"image-pruner-29531520-969xh\" (UID: \"f09bc4be-bc94-4c63-93ec-4bc2fef07d1b\") " pod="openshift-image-registry/image-pruner-29531520-969xh"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.843882 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/511cd2d5-0160-44f2-adf1-acbe5c8c28cf-metrics-tls\") pod \"dns-default-5n768\" (UID: \"511cd2d5-0160-44f2-adf1-acbe5c8c28cf\") " pod="openshift-dns/dns-default-5n768"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.845511 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/54cdfa0a-fdb0-4509-9d56-01194a25ee63-metrics-tls\") pod \"ingress-operator-5b745b69d9-ghwq4\" (UID: \"54cdfa0a-fdb0-4509-9d56-01194a25ee63\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ghwq4"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.850633 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8dlbn"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.851686 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5f4f79cd-ada9-4ec7-b779-94d97bdadc97-console-config\") pod \"console-f9d7485db-zlnwh\" (UID: \"5f4f79cd-ada9-4ec7-b779-94d97bdadc97\") " pod="openshift-console/console-f9d7485db-zlnwh"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.852261 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c15a2653-454b-42e4-85b5-87b99cc30198-srv-cert\") pod \"olm-operator-6b444d44fb-xs7nb\" (UID: \"c15a2653-454b-42e4-85b5-87b99cc30198\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xs7nb"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.856285 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5f4f79cd-ada9-4ec7-b779-94d97bdadc97-console-serving-cert\") pod \"console-f9d7485db-zlnwh\" (UID: \"5f4f79cd-ada9-4ec7-b779-94d97bdadc97\") " pod="openshift-console/console-f9d7485db-zlnwh"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.870291 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-r4c4b"]
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.870394 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m7jnm" event={"ID":"390f4e92-8639-45bb-b91c-a55773bfa293","Type":"ContainerStarted","Data":"9396a3100e1df571d38d3b83628363dae8c0bfbe194d9374bbc454ad754af95b"}
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.871716 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb7ql\" (UniqueName: \"kubernetes.io/projected/ac86b042-947d-402f-a7c8-bb0a69d3f86e-kube-api-access-fb7ql\") pod \"multus-admission-controller-857f4d67dd-cxlfh\" (UID: \"ac86b042-947d-402f-a7c8-bb0a69d3f86e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-cxlfh"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.887006 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7dh8\" (UniqueName: \"kubernetes.io/projected/2f39cab4-77fc-4641-9e84-c01b0dedc300-kube-api-access-r7dh8\") pod \"console-operator-58897d9998-p5tqf\" (UID: \"2f39cab4-77fc-4641-9e84-c01b0dedc300\") " pod="openshift-console-operator/console-operator-58897d9998-p5tqf"
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.891878 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-pvqdd"]
Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.897499 4824 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-ingress/router-default-5444994796-fp4wq" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.900395 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5d3de326-8359-4a7c-84da-57a071a929d7-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-b2k2f\" (UID: \"5d3de326-8359-4a7c-84da-57a071a929d7\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-b2k2f" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.904475 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-7vdck"] Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.915345 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:26 crc kubenswrapper[4824]: E0224 00:09:26.915879 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:27.415865694 +0000 UTC m=+231.405490163 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.924808 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-cxlfh" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.925334 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lp68j\" (UniqueName: \"kubernetes.io/projected/44376c4d-d433-41a2-bdc1-22a9792e7640-kube-api-access-lp68j\") pod \"cluster-samples-operator-665b6dd947-tjdhm\" (UID: \"44376c4d-d433-41a2-bdc1-22a9792e7640\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tjdhm" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.937583 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5w65w\" (UniqueName: \"kubernetes.io/projected/a4a8a6d0-c052-4bca-9be8-dddd6d2ef017-kube-api-access-5w65w\") pod \"catalog-operator-68c6474976-rvv9j\" (UID: \"a4a8a6d0-c052-4bca-9be8-dddd6d2ef017\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rvv9j" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.961873 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdkzl\" (UniqueName: \"kubernetes.io/projected/9016587d-3cd5-46d7-bd50-586cd32933f7-kube-api-access-cdkzl\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:26 crc kubenswrapper[4824]: 
I0224 00:09:26.978786 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9016587d-3cd5-46d7-bd50-586cd32933f7-bound-sa-token\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:26 crc kubenswrapper[4824]: I0224 00:09:26.995216 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9k27r"] Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.017989 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:27 crc kubenswrapper[4824]: E0224 00:09:27.018192 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:27.518136732 +0000 UTC m=+231.507761201 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.019312 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:27 crc kubenswrapper[4824]: E0224 00:09:27.019833 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:27.519818806 +0000 UTC m=+231.509443275 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.021837 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-9ml5g"] Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.024057 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tjdhm" Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.030498 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sr4rm\" (UniqueName: \"kubernetes.io/projected/54cdfa0a-fdb0-4509-9d56-01194a25ee63-kube-api-access-sr4rm\") pod \"ingress-operator-5b745b69d9-ghwq4\" (UID: \"54cdfa0a-fdb0-4509-9d56-01194a25ee63\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ghwq4" Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.035345 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-xfl22" Feb 24 00:09:27 crc kubenswrapper[4824]: W0224 00:09:27.047875 4824 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c4e1d48_7f8d_44b6_97b3_3ceccb35385b.slice/crio-d168e7b48bb45e2f3eeaabaaae34172927392199794b2e25a747dcf303c33d18 WatchSource:0}: Error finding container d168e7b48bb45e2f3eeaabaaae34172927392199794b2e25a747dcf303c33d18: Status 404 returned error can't find the container with id d168e7b48bb45e2f3eeaabaaae34172927392199794b2e25a747dcf303c33d18 Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.049621 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vgr74" Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.053437 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f76k\" (UniqueName: \"kubernetes.io/projected/e312a49f-dc7a-49fc-9baf-3105fec587ae-kube-api-access-6f76k\") pod \"marketplace-operator-79b997595-99tkw\" (UID: \"e312a49f-dc7a-49fc-9baf-3105fec587ae\") " pod="openshift-marketplace/marketplace-operator-79b997595-99tkw" Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.079274 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9glzv\" (UniqueName: \"kubernetes.io/projected/239fc97c-cb5a-4fa1-965e-7b64c90268ce-kube-api-access-9glzv\") pod \"collect-profiles-29531520-xxvzq\" (UID: \"239fc97c-cb5a-4fa1-965e-7b64c90268ce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531520-xxvzq" Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.080686 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-p5tqf" Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.085366 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvmgf\" (UniqueName: \"kubernetes.io/projected/757dc1d0-9507-4470-8496-9162b8999465-kube-api-access-xvmgf\") pod \"machine-config-server-vvlvv\" (UID: \"757dc1d0-9507-4470-8496-9162b8999465\") " pod="openshift-machine-config-operator/machine-config-server-vvlvv" Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.104406 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-hl2rv"] Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.104818 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6zzs\" (UniqueName: \"kubernetes.io/projected/67b600b4-d056-4b5f-b75e-0502de432461-kube-api-access-h6zzs\") pod \"service-ca-operator-777779d784-6h296\" (UID: \"67b600b4-d056-4b5f-b75e-0502de432461\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6h296" Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.115480 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-kh6hg"] Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.124044 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsdfv\" (UniqueName: \"kubernetes.io/projected/fc6c8d25-266d-4f40-9b7c-e1697b87db51-kube-api-access-gsdfv\") pod \"packageserver-d55dfcdfc-d95c9\" (UID: \"fc6c8d25-266d-4f40-9b7c-e1697b87db51\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d95c9" Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.127617 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-vvlvv" Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.132045 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jf5jw"] Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.135960 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-b2k2f" Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.135991 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:27 crc kubenswrapper[4824]: E0224 00:09:27.136894 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:27.636875131 +0000 UTC m=+231.626499600 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.138090 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29531520-969xh" Feb 24 00:09:27 crc kubenswrapper[4824]: W0224 00:09:27.143198 4824 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53344821_2f26_459a_9e42_003f3f1b5a87.slice/crio-e2e6496dd9b65bd1a5092c51c904c096168802f1d2367b1a5b29b292de68f360 WatchSource:0}: Error finding container e2e6496dd9b65bd1a5092c51c904c096168802f1d2367b1a5b29b292de68f360: Status 404 returned error can't find the container with id e2e6496dd9b65bd1a5092c51c904c096168802f1d2367b1a5b29b292de68f360 Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.146181 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hgp7\" (UniqueName: \"kubernetes.io/projected/cb0a4d10-0131-49ec-97f5-e77f1f222cdd-kube-api-access-2hgp7\") pod \"service-ca-9c57cc56f-8krrp\" (UID: \"cb0a4d10-0131-49ec-97f5-e77f1f222cdd\") " pod="openshift-service-ca/service-ca-9c57cc56f-8krrp" Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.165643 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvwdb\" (UniqueName: \"kubernetes.io/projected/a2e0a401-0fd7-499c-ac31-fc8cb0a64366-kube-api-access-hvwdb\") pod \"package-server-manager-789f6589d5-2jqvq\" (UID: \"a2e0a401-0fd7-499c-ac31-fc8cb0a64366\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2jqvq" Feb 24 00:09:27 crc kubenswrapper[4824]: W0224 00:09:27.166562 4824 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f8699c7_58f5_4a80_b5af_5403cb178676.slice/crio-ac2733fb1a358b53d6cecdc04c18db6dd2ffab884268bd9f970b2082f8018667 WatchSource:0}: Error finding container ac2733fb1a358b53d6cecdc04c18db6dd2ffab884268bd9f970b2082f8018667: Status 404 returned error can't find the container with id 
ac2733fb1a358b53d6cecdc04c18db6dd2ffab884268bd9f970b2082f8018667 Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.179148 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb8wj\" (UniqueName: \"kubernetes.io/projected/511cd2d5-0160-44f2-adf1-acbe5c8c28cf-kube-api-access-qb8wj\") pod \"dns-default-5n768\" (UID: \"511cd2d5-0160-44f2-adf1-acbe5c8c28cf\") " pod="openshift-dns/dns-default-5n768" Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.185960 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rvv9j" Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.203287 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzbm6\" (UniqueName: \"kubernetes.io/projected/782d3fe9-7b5b-4d44-a3e6-0efea9d617ea-kube-api-access-bzbm6\") pod \"ingress-canary-4tvd9\" (UID: \"782d3fe9-7b5b-4d44-a3e6-0efea9d617ea\") " pod="openshift-ingress-canary/ingress-canary-4tvd9" Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.225766 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfr7v\" (UniqueName: \"kubernetes.io/projected/ac257861-33c1-4e92-9d58-bb7351f6316e-kube-api-access-gfr7v\") pod \"migrator-59844c95c7-5ltfz\" (UID: \"ac257861-33c1-4e92-9d58-bb7351f6316e\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5ltfz" Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.237905 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:27 crc kubenswrapper[4824]: E0224 00:09:27.238619 4824 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:27.738573634 +0000 UTC m=+231.728198103 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.238823 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdcnv\" (UniqueName: \"kubernetes.io/projected/a7389587-c14d-45bd-b642-4ba3b5d7ac41-kube-api-access-xdcnv\") pod \"kube-storage-version-migrator-operator-b67b599dd-v9bck\" (UID: \"a7389587-c14d-45bd-b642-4ba3b5d7ac41\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v9bck" Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.259262 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v9d5\" (UniqueName: \"kubernetes.io/projected/4cca2c2a-43ba-4b84-b10b-25053c6d7350-kube-api-access-7v9d5\") pod \"cluster-image-registry-operator-dc59b4c8b-w6pjq\" (UID: \"4cca2c2a-43ba-4b84-b10b-25053c6d7350\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-w6pjq" Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.259568 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-cxlfh"] Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.275451 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v9bck" Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.275504 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-99tkw" Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.280750 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ddjs\" (UniqueName: \"kubernetes.io/projected/5f4f79cd-ada9-4ec7-b779-94d97bdadc97-kube-api-access-2ddjs\") pod \"console-f9d7485db-zlnwh\" (UID: \"5f4f79cd-ada9-4ec7-b779-94d97bdadc97\") " pod="openshift-console/console-f9d7485db-zlnwh" Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.284246 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5ltfz" Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.299820 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-8krrp" Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.309689 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-jm7qk"] Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.315580 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4cca2c2a-43ba-4b84-b10b-25053c6d7350-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-w6pjq\" (UID: \"4cca2c2a-43ba-4b84-b10b-25053c6d7350\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-w6pjq" Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.315719 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d95c9" Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.320017 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-q8hvw"] Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.325268 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnxwh\" (UniqueName: \"kubernetes.io/projected/42d75b69-be96-43de-8687-444a81d8ebd5-kube-api-access-bnxwh\") pod \"csi-hostpathplugin-dq9gz\" (UID: \"42d75b69-be96-43de-8687-444a81d8ebd5\") " pod="hostpath-provisioner/csi-hostpathplugin-dq9gz" Feb 24 00:09:27 crc kubenswrapper[4824]: W0224 00:09:27.329063 4824 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac86b042_947d_402f_a7c8_bb0a69d3f86e.slice/crio-3854cfb9a082003d28ae514db1807e177339f5296b54d5724d5dc75eba19a2ff WatchSource:0}: Error finding container 3854cfb9a082003d28ae514db1807e177339f5296b54d5724d5dc75eba19a2ff: Status 404 returned error can't find the container with id 3854cfb9a082003d28ae514db1807e177339f5296b54d5724d5dc75eba19a2ff Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.329685 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-jkghx"] Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.334085 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-ksmk7"] Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.336229 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29531520-xxvzq" Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.338145 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4thqj\" (UniqueName: \"kubernetes.io/projected/c15a2653-454b-42e4-85b5-87b99cc30198-kube-api-access-4thqj\") pod \"olm-operator-6b444d44fb-xs7nb\" (UID: \"c15a2653-454b-42e4-85b5-87b99cc30198\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xs7nb" Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.338564 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:27 crc kubenswrapper[4824]: E0224 00:09:27.338923 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:27.838908551 +0000 UTC m=+231.828533020 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.346870 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2jqvq" Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.354742 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-6h296" Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.362039 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c3e3160a-60c7-424c-b5a3-53841213467d-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-c94hb\" (UID: \"c3e3160a-60c7-424c-b5a3-53841213467d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c94hb" Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.362357 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-zlnwh" Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.370942 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4tvd9" Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.383025 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/54cdfa0a-fdb0-4509-9d56-01194a25ee63-bound-sa-token\") pod \"ingress-operator-5b745b69d9-ghwq4\" (UID: \"54cdfa0a-fdb0-4509-9d56-01194a25ee63\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ghwq4" Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.407359 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-dq9gz" Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.413680 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-h7djl"] Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.426494 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8dlbn"] Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.428895 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-xfl22"] Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.429059 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-5n768" Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.432288 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lwm9v"] Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.440171 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:27 crc kubenswrapper[4824]: E0224 00:09:27.440610 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:27.940592803 +0000 UTC m=+231.930217272 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:27 crc kubenswrapper[4824]: W0224 00:09:27.453053 4824 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb14c3eec_796c_48b0_b4fe_67cb327f2de7.slice/crio-b75b4015cdaa1d2a42af15d31cc32c5412fd69494b8eaf3585095e889a5412d0 WatchSource:0}: Error finding container b75b4015cdaa1d2a42af15d31cc32c5412fd69494b8eaf3585095e889a5412d0: Status 404 returned error can't find the container with id b75b4015cdaa1d2a42af15d31cc32c5412fd69494b8eaf3585095e889a5412d0 Feb 24 00:09:27 crc kubenswrapper[4824]: W0224 00:09:27.505609 4824 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod836fad19_b7d1_434c_9fd8_faf3eb1d80d1.slice/crio-87e6107c078e7479ad93e38388a25c73c91e9c6c58b31e20dcb1b47002f12310 WatchSource:0}: Error finding container 87e6107c078e7479ad93e38388a25c73c91e9c6c58b31e20dcb1b47002f12310: Status 404 returned error can't find the container with id 87e6107c078e7479ad93e38388a25c73c91e9c6c58b31e20dcb1b47002f12310 Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.541459 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.552430 4824 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-w6pjq" Feb 24 00:09:27 crc kubenswrapper[4824]: E0224 00:09:27.554126 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:28.054090385 +0000 UTC m=+232.043714854 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.554452 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xs7nb" Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.605280 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c94hb" Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.619336 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rvv9j"] Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.626837 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ghwq4" Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.643713 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:27 crc kubenswrapper[4824]: E0224 00:09:27.644388 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:28.144370299 +0000 UTC m=+232.133994768 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.650705 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tjdhm"] Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.690776 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-p5tqf"] Feb 24 00:09:27 crc kubenswrapper[4824]: W0224 00:09:27.712421 4824 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4a8a6d0_c052_4bca_9be8_dddd6d2ef017.slice/crio-69b8b9e353f95a2492ada4ea39468dc13589ceb0616170628777a130c44de6c3 WatchSource:0}: Error finding container 69b8b9e353f95a2492ada4ea39468dc13589ceb0616170628777a130c44de6c3: Status 404 returned error can't find the container with id 69b8b9e353f95a2492ada4ea39468dc13589ceb0616170628777a130c44de6c3 Feb 24 00:09:27 crc kubenswrapper[4824]: W0224 00:09:27.744023 4824 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f39cab4_77fc_4641_9e84_c01b0dedc300.slice/crio-f2b04c6997b421395ed405ee1136616fafacd3f6f41750220fb508cbe7531926 WatchSource:0}: Error finding container f2b04c6997b421395ed405ee1136616fafacd3f6f41750220fb508cbe7531926: Status 404 returned error can't find the container with id f2b04c6997b421395ed405ee1136616fafacd3f6f41750220fb508cbe7531926 Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.747054 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:27 crc kubenswrapper[4824]: E0224 00:09:27.747823 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:28.247793396 +0000 UTC m=+232.237417865 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.765075 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29531520-969xh"] Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.849356 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:27 crc kubenswrapper[4824]: E0224 00:09:27.849853 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:28.349836368 +0000 UTC m=+232.339460827 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.868347 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-b2k2f"] Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.885017 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-cxlfh" event={"ID":"ac86b042-947d-402f-a7c8-bb0a69d3f86e","Type":"ContainerStarted","Data":"3854cfb9a082003d28ae514db1807e177339f5296b54d5724d5dc75eba19a2ff"} Feb 24 00:09:27 crc kubenswrapper[4824]: W0224 00:09:27.885362 4824 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf09bc4be_bc94_4c63_93ec_4bc2fef07d1b.slice/crio-4fbc9ac5dc3c69b5711041434cb98b9bdb56115d74b5092502a2732ff4babe43 WatchSource:0}: Error finding container 4fbc9ac5dc3c69b5711041434cb98b9bdb56115d74b5092502a2732ff4babe43: Status 404 returned error can't find the container with id 4fbc9ac5dc3c69b5711041434cb98b9bdb56115d74b5092502a2732ff4babe43 Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.886134 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rvv9j" event={"ID":"a4a8a6d0-c052-4bca-9be8-dddd6d2ef017","Type":"ContainerStarted","Data":"69b8b9e353f95a2492ada4ea39468dc13589ceb0616170628777a130c44de6c3"} Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.888246 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console-operator/console-operator-58897d9998-p5tqf" event={"ID":"2f39cab4-77fc-4641-9e84-c01b0dedc300","Type":"ContainerStarted","Data":"f2b04c6997b421395ed405ee1136616fafacd3f6f41750220fb508cbe7531926"} Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.895330 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m7jnm" event={"ID":"390f4e92-8639-45bb-b91c-a55773bfa293","Type":"ContainerStarted","Data":"82dc0e47877b1314fb573f81841d1421f5bbdd1583f5cff3bce8a90521be9640"} Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.901345 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-jm7qk" event={"ID":"49f90ffa-baa8-4d9b-bd17-3a12f9c6f115","Type":"ContainerStarted","Data":"3b88c71c7f646381790daa9790f722b3637990f46735a62cbc3312b308a3ab9b"} Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.904380 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-9ml5g" event={"ID":"01ed973e-7ed7-41ec-bea9-69d8c86e19ed","Type":"ContainerStarted","Data":"6a84a663d942eb0b5a72d9e552d94ef02c3769dcfcad6ef19a67b74eca023607"} Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.911880 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-q8hvw" event={"ID":"13bff804-f118-473b-a547-433aed671b46","Type":"ContainerStarted","Data":"f3f6deb1282d355ebf00359716237c84d49ff6c7dfd3fbfcf4ed7963a90f8deb"} Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.924204 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-r4c4b" event={"ID":"581e69ae-c21a-4a9e-b1ea-9c38256d7b30","Type":"ContainerStarted","Data":"deb3616bdfcc08678302c0e0617b53f7bdd5f57fee7e5facc2929f3b91c7322b"} Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.924264 4824 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-r4c4b" event={"ID":"581e69ae-c21a-4a9e-b1ea-9c38256d7b30","Type":"ContainerStarted","Data":"9b97a0a4486ae2cc63c665caa223ddb2e50e244e1359fe03d78a031900e54400"} Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.924769 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-r4c4b" Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.926464 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xfl22" event={"ID":"f66ddecd-538b-48bd-a335-e7f99181daa0","Type":"ContainerStarted","Data":"79cda5a3afc87515700b0f0131cdecfbd8ca4df84513146d57fe549bf0db6cd6"} Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.928128 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-pvqdd" event={"ID":"d0c9bb52-a40f-4e1d-ba3d-fbb14eb74110","Type":"ContainerStarted","Data":"530327b92b782379bd097164c03bb9229bcf211e2c8a3fb1f8745b4be7f7b5ad"} Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.929765 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9k27r" event={"ID":"1c4e1d48-7f8d-44b6-97b3-3ceccb35385b","Type":"ContainerStarted","Data":"d168e7b48bb45e2f3eeaabaaae34172927392199794b2e25a747dcf303c33d18"} Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.930927 4824 patch_prober.go:28] interesting pod/downloads-7954f5f757-r4c4b container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.931103 4824 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-r4c4b" podUID="581e69ae-c21a-4a9e-b1ea-9c38256d7b30" 
containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.933593 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ksmk7" event={"ID":"b14c3eec-796c-48b0-b4fe-67cb327f2de7","Type":"ContainerStarted","Data":"b75b4015cdaa1d2a42af15d31cc32c5412fd69494b8eaf3585095e889a5412d0"} Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.939534 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-7vdck" event={"ID":"2595ad7b-d8cc-46d2-b0db-60e7ce636e9f","Type":"ContainerStarted","Data":"bef25273eb059dd1ab72abd3983263df0ab7e759d96ada14df8ba73000a1f593"} Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.939585 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-7vdck" event={"ID":"2595ad7b-d8cc-46d2-b0db-60e7ce636e9f","Type":"ContainerStarted","Data":"8932887d5d14bb497fe31bc893f05b1b68140314607684cd919a64d5341e6082"} Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.947897 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8dlbn" event={"ID":"bb795a58-0029-416c-84fa-ae83cf338858","Type":"ContainerStarted","Data":"9b6f6c895fff8aaaed37faec3b49457fd54c815cfdb0c7145450f3fd7a32f97d"} Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.949741 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jkghx" event={"ID":"836fad19-b7d1-434c-9fd8-faf3eb1d80d1","Type":"ContainerStarted","Data":"87e6107c078e7479ad93e38388a25c73c91e9c6c58b31e20dcb1b47002f12310"} Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.950248 4824 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:27 crc kubenswrapper[4824]: E0224 00:09:27.950736 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:28.4507151 +0000 UTC m=+232.440339569 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.951461 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-h7djl" event={"ID":"8118fe3c-1479-4634-9b64-9350991d909d","Type":"ContainerStarted","Data":"6c28fd392465c1155e5f0751f4879dd6b35f5e6f98909953be239d553f455816"} Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.952081 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hl2rv" event={"ID":"33028c7d-6bd8-46bb-a7d9-e1c46b0a26e4","Type":"ContainerStarted","Data":"98f7f928cc84639f65f90fdefcf2fd2fbae904cbd165407fbf7d9753ff24601b"} Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.953057 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-authentication/oauth-openshift-558db77b4-jf5jw" event={"ID":"6f8699c7-58f5-4a80-b5af-5403cb178676","Type":"ContainerStarted","Data":"ac2733fb1a358b53d6cecdc04c18db6dd2ffab884268bd9f970b2082f8018667"} Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.953821 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vgr74" event={"ID":"4349e271-7758-4dcf-9053-fbc984436a8b","Type":"ContainerStarted","Data":"1a1314c923a76d17830d326aab8824c565167be258b1dd220e62acb8d1e4eb68"} Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.954439 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lwm9v" event={"ID":"250422bb-6e8f-4622-a456-ded5825e7c86","Type":"ContainerStarted","Data":"621b31c4a6ab74d4e0dd43cf066f7ee99350da0fe63058e5278375c53595a9f3"} Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.958317 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-vvlvv" event={"ID":"757dc1d0-9507-4470-8496-9162b8999465","Type":"ContainerStarted","Data":"9cd2b91af806441c9262996df56b5b14e5883d9bdae0028c3be7a8f609baf215"} Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.959905 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-kh6hg" event={"ID":"53344821-2f26-459a-9e42-003f3f1b5a87","Type":"ContainerStarted","Data":"e2e6496dd9b65bd1a5092c51c904c096168802f1d2367b1a5b29b292de68f360"} Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.963252 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-fp4wq" event={"ID":"5b0ff99f-1e04-4e23-895a-a02a303c8daa","Type":"ContainerStarted","Data":"8a04e1120c09a51d0bfbf418056c056935cda72eb55f40882a439cb22de54ad2"} Feb 24 00:09:27 crc kubenswrapper[4824]: I0224 00:09:27.963319 4824 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-fp4wq" event={"ID":"5b0ff99f-1e04-4e23-895a-a02a303c8daa","Type":"ContainerStarted","Data":"bbdcb3c8f2541317f6a10dae8c71fbfc2f6357afb4cd3a5b35094b0739aa7bdb"} Feb 24 00:09:28 crc kubenswrapper[4824]: I0224 00:09:28.025674 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d95c9"] Feb 24 00:09:28 crc kubenswrapper[4824]: I0224 00:09:28.055971 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:28 crc kubenswrapper[4824]: E0224 00:09:28.056343 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:28.556327865 +0000 UTC m=+232.545952334 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:28 crc kubenswrapper[4824]: I0224 00:09:28.087803 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2jqvq"] Feb 24 00:09:28 crc kubenswrapper[4824]: I0224 00:09:28.128383 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29531520-xxvzq"] Feb 24 00:09:28 crc kubenswrapper[4824]: I0224 00:09:28.155251 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-6h296"] Feb 24 00:09:28 crc kubenswrapper[4824]: I0224 00:09:28.157527 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:28 crc kubenswrapper[4824]: E0224 00:09:28.158095 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:28.658072789 +0000 UTC m=+232.647697258 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:28 crc kubenswrapper[4824]: I0224 00:09:28.181593 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5n768"] Feb 24 00:09:28 crc kubenswrapper[4824]: I0224 00:09:28.216926 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c94hb"] Feb 24 00:09:28 crc kubenswrapper[4824]: I0224 00:09:28.260273 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:28 crc kubenswrapper[4824]: E0224 00:09:28.260748 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:28.760728326 +0000 UTC m=+232.750352795 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:28 crc kubenswrapper[4824]: I0224 00:09:28.370365 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:28 crc kubenswrapper[4824]: E0224 00:09:28.372994 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:28.872971695 +0000 UTC m=+232.862596164 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:28 crc kubenswrapper[4824]: I0224 00:09:28.397713 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-5ltfz"] Feb 24 00:09:28 crc kubenswrapper[4824]: I0224 00:09:28.424062 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v9bck"] Feb 24 00:09:28 crc kubenswrapper[4824]: I0224 00:09:28.462014 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-8krrp"] Feb 24 00:09:28 crc kubenswrapper[4824]: I0224 00:09:28.474797 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:28 crc kubenswrapper[4824]: E0224 00:09:28.475252 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:28.975236163 +0000 UTC m=+232.964860632 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:28 crc kubenswrapper[4824]: I0224 00:09:28.576170 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:28 crc kubenswrapper[4824]: E0224 00:09:28.576645 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:29.076620587 +0000 UTC m=+233.066245056 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:28 crc kubenswrapper[4824]: W0224 00:09:28.652842 4824 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc6c8d25_266d_4f40_9b7c_e1697b87db51.slice/crio-cdcdb1571f7e1faf1e4a1c617eb8f73363fd3c5be2798a32b2b3ec63648683db WatchSource:0}: Error finding container cdcdb1571f7e1faf1e4a1c617eb8f73363fd3c5be2798a32b2b3ec63648683db: Status 404 returned error can't find the container with id cdcdb1571f7e1faf1e4a1c617eb8f73363fd3c5be2798a32b2b3ec63648683db Feb 24 00:09:28 crc kubenswrapper[4824]: I0224 00:09:28.680563 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:28 crc kubenswrapper[4824]: E0224 00:09:28.681315 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:29.181272877 +0000 UTC m=+233.170897526 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:28 crc kubenswrapper[4824]: W0224 00:09:28.725063 4824 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67b600b4_d056_4b5f_b75e_0502de432461.slice/crio-603f42335849da53e1c1b5dfc8f5b1ff9f6569fbac08f8c6d978eaa1f0ede865 WatchSource:0}: Error finding container 603f42335849da53e1c1b5dfc8f5b1ff9f6569fbac08f8c6d978eaa1f0ede865: Status 404 returned error can't find the container with id 603f42335849da53e1c1b5dfc8f5b1ff9f6569fbac08f8c6d978eaa1f0ede865 Feb 24 00:09:28 crc kubenswrapper[4824]: I0224 00:09:28.728766 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-99tkw"] Feb 24 00:09:28 crc kubenswrapper[4824]: I0224 00:09:28.783164 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:28 crc kubenswrapper[4824]: E0224 00:09:28.783962 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:29.283938645 +0000 UTC m=+233.273563114 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:28 crc kubenswrapper[4824]: I0224 00:09:28.893151 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:28 crc kubenswrapper[4824]: E0224 00:09:28.895932 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:29.393507054 +0000 UTC m=+233.383131523 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:28 crc kubenswrapper[4824]: I0224 00:09:28.898365 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-fp4wq" Feb 24 00:09:28 crc kubenswrapper[4824]: W0224 00:09:28.905238 4824 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb0a4d10_0131_49ec_97f5_e77f1f222cdd.slice/crio-24f1d15c6dfa786e261f3dfad2f6f8184b4edad621931cf5ea95c0b98429c259 WatchSource:0}: Error finding container 24f1d15c6dfa786e261f3dfad2f6f8184b4edad621931cf5ea95c0b98429c259: Status 404 returned error can't find the container with id 24f1d15c6dfa786e261f3dfad2f6f8184b4edad621931cf5ea95c0b98429c259 Feb 24 00:09:28 crc kubenswrapper[4824]: I0224 00:09:28.905575 4824 patch_prober.go:28] interesting pod/router-default-5444994796-fp4wq container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Feb 24 00:09:28 crc kubenswrapper[4824]: I0224 00:09:28.905621 4824 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fp4wq" podUID="5b0ff99f-1e04-4e23-895a-a02a303c8daa" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Feb 24 00:09:28 crc kubenswrapper[4824]: W0224 00:09:28.955140 4824 manager.go:1169] Failed to process watch 
event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac257861_33c1_4e92_9d58_bb7351f6316e.slice/crio-576206ef4cd543d1284208fe093beda1ae91b3259cc7628242b381f11e37105c WatchSource:0}: Error finding container 576206ef4cd543d1284208fe093beda1ae91b3259cc7628242b381f11e37105c: Status 404 returned error can't find the container with id 576206ef4cd543d1284208fe093beda1ae91b3259cc7628242b381f11e37105c Feb 24 00:09:28 crc kubenswrapper[4824]: W0224 00:09:28.972554 4824 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode312a49f_dc7a_49fc_9baf_3105fec587ae.slice/crio-1e7b695fbb51788dd119d9e0ae76024be2038ddb563a00cf87c9d5c4544df61f WatchSource:0}: Error finding container 1e7b695fbb51788dd119d9e0ae76024be2038ddb563a00cf87c9d5c4544df61f: Status 404 returned error can't find the container with id 1e7b695fbb51788dd119d9e0ae76024be2038ddb563a00cf87c9d5c4544df61f Feb 24 00:09:28 crc kubenswrapper[4824]: I0224 00:09:28.977466 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5n768" event={"ID":"511cd2d5-0160-44f2-adf1-acbe5c8c28cf","Type":"ContainerStarted","Data":"e43f346005a2d3ccf7b602123427aa11d0ed8ad48abf7d286256ba72ef490e6f"} Feb 24 00:09:28 crc kubenswrapper[4824]: I0224 00:09:28.982573 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-b2k2f" event={"ID":"5d3de326-8359-4a7c-84da-57a071a929d7","Type":"ContainerStarted","Data":"de1cdc762d748da1200dd4014fc8eb8bd10989af0ebeebbb6cf5cdb6d9e440ce"} Feb 24 00:09:28 crc kubenswrapper[4824]: I0224 00:09:28.986201 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-jf5jw" event={"ID":"6f8699c7-58f5-4a80-b5af-5403cb178676","Type":"ContainerStarted","Data":"fc75bfe4b562302aad30993aa2a68489589d802790238c3eeef171430ffcd747"} Feb 24 00:09:28 crc 
kubenswrapper[4824]: I0224 00:09:28.987239 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-jf5jw" Feb 24 00:09:28 crc kubenswrapper[4824]: I0224 00:09:28.992814 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c94hb" event={"ID":"c3e3160a-60c7-424c-b5a3-53841213467d","Type":"ContainerStarted","Data":"a573ee6ab5f7e155f1846057ab13689cdd9186f5eda034467c0aadf412cd950e"} Feb 24 00:09:28 crc kubenswrapper[4824]: I0224 00:09:28.993296 4824 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-jf5jw container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.13:6443/healthz\": dial tcp 10.217.0.13:6443: connect: connection refused" start-of-body= Feb 24 00:09:28 crc kubenswrapper[4824]: I0224 00:09:28.993345 4824 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-jf5jw" podUID="6f8699c7-58f5-4a80-b5af-5403cb178676" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.13:6443/healthz\": dial tcp 10.217.0.13:6443: connect: connection refused" Feb 24 00:09:28 crc kubenswrapper[4824]: I0224 00:09:28.993736 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:28 crc kubenswrapper[4824]: E0224 00:09:28.994057 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-24 00:09:29.494037936 +0000 UTC m=+233.483662405 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:28 crc kubenswrapper[4824]: I0224 00:09:28.994202 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:28 crc kubenswrapper[4824]: E0224 00:09:28.994549 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:29.494540359 +0000 UTC m=+233.484164828 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:28 crc kubenswrapper[4824]: I0224 00:09:28.995841 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-pvqdd" event={"ID":"d0c9bb52-a40f-4e1d-ba3d-fbb14eb74110","Type":"ContainerStarted","Data":"1a255e62aab2f5564fe5ff78c6845daa8f4591ba62c038ccf2871069a11dc2b4"} Feb 24 00:09:29 crc kubenswrapper[4824]: I0224 00:09:29.000654 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29531520-xxvzq" event={"ID":"239fc97c-cb5a-4fa1-965e-7b64c90268ce","Type":"ContainerStarted","Data":"4d0da2c3da00c6dd6cf100ba43dd4048f42c65ed90df04df6d04e96db17f2c53"} Feb 24 00:09:29 crc kubenswrapper[4824]: I0224 00:09:29.004617 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hl2rv" event={"ID":"33028c7d-6bd8-46bb-a7d9-e1c46b0a26e4","Type":"ContainerStarted","Data":"69873e9d9e5bc06a933ebde7f766baf24b81f7e550ee3aca2edb4f71f6eb32ac"} Feb 24 00:09:29 crc kubenswrapper[4824]: I0224 00:09:29.004677 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29531520-969xh" event={"ID":"f09bc4be-bc94-4c63-93ec-4bc2fef07d1b","Type":"ContainerStarted","Data":"4fbc9ac5dc3c69b5711041434cb98b9bdb56115d74b5092502a2732ff4babe43"} Feb 24 00:09:29 crc kubenswrapper[4824]: I0224 00:09:29.006397 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9k27r" event={"ID":"1c4e1d48-7f8d-44b6-97b3-3ceccb35385b","Type":"ContainerStarted","Data":"b81e27c4f382da70441b9aeabc639347240cb4f01fe8b81ba75c60688c944968"} Feb 24 00:09:29 crc kubenswrapper[4824]: I0224 00:09:29.006934 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9k27r" Feb 24 00:09:29 crc kubenswrapper[4824]: I0224 00:09:29.017647 4824 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-9k27r container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Feb 24 00:09:29 crc kubenswrapper[4824]: I0224 00:09:29.017687 4824 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9k27r" podUID="1c4e1d48-7f8d-44b6-97b3-3ceccb35385b" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" Feb 24 00:09:29 crc kubenswrapper[4824]: I0224 00:09:29.057605 4824 generic.go:334] "Generic (PLEG): container finished" podID="01ed973e-7ed7-41ec-bea9-69d8c86e19ed" containerID="91ed045e95f66b7afb1a6cf84984b9da73731bffb60c54002e215cbe84b34c12" exitCode=0 Feb 24 00:09:29 crc kubenswrapper[4824]: I0224 00:09:29.057783 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-9ml5g" event={"ID":"01ed973e-7ed7-41ec-bea9-69d8c86e19ed","Type":"ContainerDied","Data":"91ed045e95f66b7afb1a6cf84984b9da73731bffb60c54002e215cbe84b34c12"} Feb 24 00:09:29 crc kubenswrapper[4824]: I0224 00:09:29.095984 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:29 crc kubenswrapper[4824]: E0224 00:09:29.100733 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:29.600683828 +0000 UTC m=+233.590308297 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:29 crc kubenswrapper[4824]: I0224 00:09:29.109778 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tjdhm" event={"ID":"44376c4d-d433-41a2-bdc1-22a9792e7640","Type":"ContainerStarted","Data":"b26d2427ecad64c84ce29d1b0b8b771dd0b20b42de4768633a17bcd1d64a93f9"} Feb 24 00:09:29 crc kubenswrapper[4824]: I0224 00:09:29.120244 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-kh6hg" event={"ID":"53344821-2f26-459a-9e42-003f3f1b5a87","Type":"ContainerStarted","Data":"416937b951cda54739a4849ddf48f54f63846aeb5fcdf83281d98f16d49cb1a5"} Feb 24 00:09:29 crc kubenswrapper[4824]: I0224 00:09:29.125137 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v9bck" event={"ID":"a7389587-c14d-45bd-b642-4ba3b5d7ac41","Type":"ContainerStarted","Data":"90172e64e033d046b55bfd59bd0d4866270173465c33f5995b3a5117cdf52d53"} Feb 24 00:09:29 crc kubenswrapper[4824]: I0224 00:09:29.127966 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-8krrp" event={"ID":"cb0a4d10-0131-49ec-97f5-e77f1f222cdd","Type":"ContainerStarted","Data":"24f1d15c6dfa786e261f3dfad2f6f8184b4edad621931cf5ea95c0b98429c259"} Feb 24 00:09:29 crc kubenswrapper[4824]: I0224 00:09:29.138319 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d95c9" event={"ID":"fc6c8d25-266d-4f40-9b7c-e1697b87db51","Type":"ContainerStarted","Data":"cdcdb1571f7e1faf1e4a1c617eb8f73363fd3c5be2798a32b2b3ec63648683db"} Feb 24 00:09:29 crc kubenswrapper[4824]: I0224 00:09:29.141931 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-6h296" event={"ID":"67b600b4-d056-4b5f-b75e-0502de432461","Type":"ContainerStarted","Data":"603f42335849da53e1c1b5dfc8f5b1ff9f6569fbac08f8c6d978eaa1f0ede865"} Feb 24 00:09:29 crc kubenswrapper[4824]: I0224 00:09:29.145853 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2jqvq" event={"ID":"a2e0a401-0fd7-499c-ac31-fc8cb0a64366","Type":"ContainerStarted","Data":"5d56999380ffc886b9f74be0d31ebfa6f8bff94423f32fdfacad2a5b80cac8ac"} Feb 24 00:09:29 crc kubenswrapper[4824]: I0224 00:09:29.147720 4824 patch_prober.go:28] interesting pod/downloads-7954f5f757-r4c4b container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Feb 24 00:09:29 crc kubenswrapper[4824]: 
I0224 00:09:29.147796 4824 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-r4c4b" podUID="581e69ae-c21a-4a9e-b1ea-9c38256d7b30" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Feb 24 00:09:29 crc kubenswrapper[4824]: I0224 00:09:29.151472 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-dq9gz"] Feb 24 00:09:29 crc kubenswrapper[4824]: I0224 00:09:29.189767 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-r4c4b" podStartSLOduration=159.18974776 podStartE2EDuration="2m39.18974776s" podCreationTimestamp="2026-02-24 00:06:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:09:29.181549215 +0000 UTC m=+233.171173694" watchObservedRunningTime="2026-02-24 00:09:29.18974776 +0000 UTC m=+233.179372229" Feb 24 00:09:29 crc kubenswrapper[4824]: I0224 00:09:29.204706 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:29 crc kubenswrapper[4824]: E0224 00:09:29.207047 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:29.707032383 +0000 UTC m=+233.696656852 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:29 crc kubenswrapper[4824]: I0224 00:09:29.261771 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m7jnm" podStartSLOduration=160.261745775 podStartE2EDuration="2m40.261745775s" podCreationTimestamp="2026-02-24 00:06:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:09:29.254333351 +0000 UTC m=+233.243957820" watchObservedRunningTime="2026-02-24 00:09:29.261745775 +0000 UTC m=+233.251370244" Feb 24 00:09:29 crc kubenswrapper[4824]: I0224 00:09:29.294169 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xs7nb"] Feb 24 00:09:29 crc kubenswrapper[4824]: I0224 00:09:29.297457 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-fp4wq" podStartSLOduration=159.297416729 podStartE2EDuration="2m39.297416729s" podCreationTimestamp="2026-02-24 00:06:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:09:29.295396636 +0000 UTC m=+233.285021105" watchObservedRunningTime="2026-02-24 00:09:29.297416729 +0000 UTC m=+233.287041198" Feb 24 00:09:29 crc kubenswrapper[4824]: I0224 00:09:29.307287 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:29 crc kubenswrapper[4824]: E0224 00:09:29.308969 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:29.808946491 +0000 UTC m=+233.798570960 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:29 crc kubenswrapper[4824]: I0224 00:09:29.325738 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-ghwq4"] Feb 24 00:09:29 crc kubenswrapper[4824]: I0224 00:09:29.332305 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-zlnwh"] Feb 24 00:09:29 crc kubenswrapper[4824]: I0224 00:09:29.381216 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-w6pjq"] Feb 24 00:09:29 crc kubenswrapper[4824]: I0224 00:09:29.394663 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-jf5jw" podStartSLOduration=160.394635634 podStartE2EDuration="2m40.394635634s" podCreationTimestamp="2026-02-24 00:06:49 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:09:29.3471209 +0000 UTC m=+233.336745369" watchObservedRunningTime="2026-02-24 00:09:29.394635634 +0000 UTC m=+233.384260103" Feb 24 00:09:29 crc kubenswrapper[4824]: I0224 00:09:29.398279 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-pvqdd" podStartSLOduration=159.398251119 podStartE2EDuration="2m39.398251119s" podCreationTimestamp="2026-02-24 00:06:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:09:29.372647819 +0000 UTC m=+233.362272298" watchObservedRunningTime="2026-02-24 00:09:29.398251119 +0000 UTC m=+233.387875588" Feb 24 00:09:29 crc kubenswrapper[4824]: I0224 00:09:29.398713 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4tvd9"] Feb 24 00:09:29 crc kubenswrapper[4824]: I0224 00:09:29.410559 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:29 crc kubenswrapper[4824]: E0224 00:09:29.411069 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:29.911054684 +0000 UTC m=+233.900679153 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:29 crc kubenswrapper[4824]: I0224 00:09:29.460507 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9k27r" podStartSLOduration=159.460487359 podStartE2EDuration="2m39.460487359s" podCreationTimestamp="2026-02-24 00:06:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:09:29.456051933 +0000 UTC m=+233.445676402" watchObservedRunningTime="2026-02-24 00:09:29.460487359 +0000 UTC m=+233.450111828" Feb 24 00:09:29 crc kubenswrapper[4824]: I0224 00:09:29.463706 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-7vdck" podStartSLOduration=160.463697723 podStartE2EDuration="2m40.463697723s" podCreationTimestamp="2026-02-24 00:06:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:09:29.421691163 +0000 UTC m=+233.411315652" watchObservedRunningTime="2026-02-24 00:09:29.463697723 +0000 UTC m=+233.453322202" Feb 24 00:09:29 crc kubenswrapper[4824]: I0224 00:09:29.511380 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:29 crc kubenswrapper[4824]: E0224 00:09:29.512072 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:30.012026298 +0000 UTC m=+234.001650767 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:29 crc kubenswrapper[4824]: I0224 00:09:29.613947 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:29 crc kubenswrapper[4824]: E0224 00:09:29.614414 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:30.114391748 +0000 UTC m=+234.104016387 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:29 crc kubenswrapper[4824]: I0224 00:09:29.715146 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:29 crc kubenswrapper[4824]: E0224 00:09:29.716047 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:30.21603011 +0000 UTC m=+234.205654579 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:29 crc kubenswrapper[4824]: I0224 00:09:29.818351 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:29 crc kubenswrapper[4824]: E0224 00:09:29.818815 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:30.31880204 +0000 UTC m=+234.308426509 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:29 crc kubenswrapper[4824]: I0224 00:09:29.919982 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:29 crc kubenswrapper[4824]: E0224 00:09:29.920035 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:30.42000509 +0000 UTC m=+234.409629559 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:29 crc kubenswrapper[4824]: I0224 00:09:29.920599 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:29 crc kubenswrapper[4824]: E0224 00:09:29.921001 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:30.420985996 +0000 UTC m=+234.410610465 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:29 crc kubenswrapper[4824]: I0224 00:09:29.937755 4824 patch_prober.go:28] interesting pod/router-default-5444994796-fp4wq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 24 00:09:29 crc kubenswrapper[4824]: [-]has-synced failed: reason withheld Feb 24 00:09:29 crc kubenswrapper[4824]: [+]process-running ok Feb 24 00:09:29 crc kubenswrapper[4824]: healthz check failed Feb 24 00:09:29 crc kubenswrapper[4824]: I0224 00:09:29.937866 4824 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fp4wq" podUID="5b0ff99f-1e04-4e23-895a-a02a303c8daa" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.021735 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:30 crc kubenswrapper[4824]: E0224 00:09:30.023072 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-24 00:09:30.523043348 +0000 UTC m=+234.512667817 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.128944 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:30 crc kubenswrapper[4824]: E0224 00:09:30.129985 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:30.629609998 +0000 UTC m=+234.619234467 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.230211 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:30 crc kubenswrapper[4824]: E0224 00:09:30.230684 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:30.730650814 +0000 UTC m=+234.720275283 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.230872 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:30 crc kubenswrapper[4824]: E0224 00:09:30.231255 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:30.731246539 +0000 UTC m=+234.720871208 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.234493 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tjdhm" event={"ID":"44376c4d-d433-41a2-bdc1-22a9792e7640","Type":"ContainerStarted","Data":"99abc91f2913a10fef9474c1671213d7ae90cee98647992dc4c213a7e1946ca3"} Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.238314 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2jqvq" event={"ID":"a2e0a401-0fd7-499c-ac31-fc8cb0a64366","Type":"ContainerStarted","Data":"9f135bd87acd59182b077577cc14922f88a1e7d927ad4c74ec069a5ec970efae"} Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.239322 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-w6pjq" event={"ID":"4cca2c2a-43ba-4b84-b10b-25053c6d7350","Type":"ContainerStarted","Data":"f72ae1a2e885ada97661c5a55ce0b906a0a7cc454e49a0034e9a4aa71134c651"} Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.240430 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lwm9v" event={"ID":"250422bb-6e8f-4622-a456-ded5825e7c86","Type":"ContainerStarted","Data":"0d4991af3b6e9acf7e6d057cb38170c4c4c5e2bc88353af45ee41de342133baf"} Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.242804 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rvv9j" event={"ID":"a4a8a6d0-c052-4bca-9be8-dddd6d2ef017","Type":"ContainerStarted","Data":"aec93aaf970240a924b8e66b02bbe366e1193ded0ca6679ff5d203cf347100ea"} Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.243282 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rvv9j" Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.246294 4824 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-rvv9j container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" start-of-body= Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.246370 4824 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rvv9j" podUID="a4a8a6d0-c052-4bca-9be8-dddd6d2ef017" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.246715 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-p5tqf" event={"ID":"2f39cab4-77fc-4641-9e84-c01b0dedc300","Type":"ContainerStarted","Data":"2f01bda5b5764a5debd700f27975f9deec4ae0d662331f19e9e91caf8f764f9d"} Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.247083 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-p5tqf" Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.248501 4824 patch_prober.go:28] interesting pod/console-operator-58897d9998-p5tqf container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get 
\"https://10.217.0.24:8443/readyz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.248564 4824 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-p5tqf" podUID="2f39cab4-77fc-4641-9e84-c01b0dedc300" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/readyz\": dial tcp 10.217.0.24:8443: connect: connection refused" Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.249615 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-h7djl" event={"ID":"8118fe3c-1479-4634-9b64-9350991d909d","Type":"ContainerStarted","Data":"c96a3b6f872d0766c2756665e0f66c8eac91b9123623bf1c47ba4d185db1b859"} Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.253538 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-q8hvw" event={"ID":"13bff804-f118-473b-a547-433aed671b46","Type":"ContainerStarted","Data":"30a0cc68b0a9066eb90264fe3ae3b8d4863e3b99d2e08267535de907d2363859"} Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.261950 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8dlbn" event={"ID":"bb795a58-0029-416c-84fa-ae83cf338858","Type":"ContainerStarted","Data":"0aa26fc0f3a9ea0792414af9e37e6f9705ebdd80ae567d265f309d783d7efa2a"} Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.265971 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29531520-xxvzq" event={"ID":"239fc97c-cb5a-4fa1-965e-7b64c90268ce","Type":"ContainerStarted","Data":"594ea7953af708dc6eec520d0cd46b08f1c6126425d4ad263d064dfe050100f2"} Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.272081 4824 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lwm9v" podStartSLOduration=160.272060988 podStartE2EDuration="2m40.272060988s" podCreationTimestamp="2026-02-24 00:06:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:09:30.271890994 +0000 UTC m=+234.261515483" watchObservedRunningTime="2026-02-24 00:09:30.272060988 +0000 UTC m=+234.261685457" Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.292645 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-99tkw" event={"ID":"e312a49f-dc7a-49fc-9baf-3105fec587ae","Type":"ContainerStarted","Data":"1f7f84523e39d2e74db2895c5b1819295512a987f6083e74a45c4c25f78e706d"} Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.292955 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-99tkw" event={"ID":"e312a49f-dc7a-49fc-9baf-3105fec587ae","Type":"ContainerStarted","Data":"1e7b695fbb51788dd119d9e0ae76024be2038ddb563a00cf87c9d5c4544df61f"} Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.293275 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-99tkw" Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.296340 4824 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-99tkw container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.296389 4824 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-99tkw" podUID="e312a49f-dc7a-49fc-9baf-3105fec587ae" 
containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.302865 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-jm7qk" event={"ID":"49f90ffa-baa8-4d9b-bd17-3a12f9c6f115","Type":"ContainerStarted","Data":"5b26b0b907a714e0eb8fa1b65c28de4396d3dfbda124fc0f95f4d779730bf39c"} Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.304117 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-jm7qk" Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.306144 4824 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-jm7qk container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.306501 4824 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-jm7qk" podUID="49f90ffa-baa8-4d9b-bd17-3a12f9c6f115" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.308728 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-p5tqf" podStartSLOduration=160.308718298 podStartE2EDuration="2m40.308718298s" podCreationTimestamp="2026-02-24 00:06:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:09:30.289752501 +0000 UTC m=+234.279376990" 
watchObservedRunningTime="2026-02-24 00:09:30.308718298 +0000 UTC m=+234.298342757" Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.311253 4824 generic.go:334] "Generic (PLEG): container finished" podID="f66ddecd-538b-48bd-a335-e7f99181daa0" containerID="f0f0a604c0fb7850469e77058191d27cb2fd20f4ef1e3681ea96e8b7c8d50a7f" exitCode=0 Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.311984 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xfl22" event={"ID":"f66ddecd-538b-48bd-a335-e7f99181daa0","Type":"ContainerDied","Data":"f0f0a604c0fb7850469e77058191d27cb2fd20f4ef1e3681ea96e8b7c8d50a7f"} Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.325201 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-cxlfh" event={"ID":"ac86b042-947d-402f-a7c8-bb0a69d3f86e","Type":"ContainerStarted","Data":"ccb595331655ddf41e2d0ab8c44143a7e4f210527b1e8e9b9ea1629f52f44c0a"} Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.327968 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d95c9" event={"ID":"fc6c8d25-266d-4f40-9b7c-e1697b87db51","Type":"ContainerStarted","Data":"56f82db8dcee9fbb39506708eee0556997af7585f1ce2e526ee4589a83c6d9d1"} Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.330605 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-b2k2f" event={"ID":"5d3de326-8359-4a7c-84da-57a071a929d7","Type":"ContainerStarted","Data":"c881de585367ac2c53d1b5535bb3d04d188de0a29617c85c95b37cd9b4aeec95"} Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.331315 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:30 crc kubenswrapper[4824]: E0224 00:09:30.334364 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:30.834337369 +0000 UTC m=+234.823962028 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.347266 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8dlbn" podStartSLOduration=160.347229736 podStartE2EDuration="2m40.347229736s" podCreationTimestamp="2026-02-24 00:06:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:09:30.309634472 +0000 UTC m=+234.299258951" watchObservedRunningTime="2026-02-24 00:09:30.347229736 +0000 UTC m=+234.336854215" Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.355848 4824 generic.go:334] "Generic (PLEG): container finished" podID="836fad19-b7d1-434c-9fd8-faf3eb1d80d1" containerID="33e1400531edf7e2332b0284ef1eea808c68ad473f62614215212f0f5dbd1985" exitCode=0 Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.355936 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jkghx" 
event={"ID":"836fad19-b7d1-434c-9fd8-faf3eb1d80d1","Type":"ContainerDied","Data":"33e1400531edf7e2332b0284ef1eea808c68ad473f62614215212f0f5dbd1985"} Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.363227 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-vvlvv" event={"ID":"757dc1d0-9507-4470-8496-9162b8999465","Type":"ContainerStarted","Data":"b83094e79c3ff6fccf0589bfe884a052268f987cc9a433c1d16cc35c9027f576"} Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.372638 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rvv9j" podStartSLOduration=160.372612511 podStartE2EDuration="2m40.372612511s" podCreationTimestamp="2026-02-24 00:06:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:09:30.360964236 +0000 UTC m=+234.350588705" watchObservedRunningTime="2026-02-24 00:09:30.372612511 +0000 UTC m=+234.362236990" Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.382421 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29531520-xxvzq" podStartSLOduration=161.382403047 podStartE2EDuration="2m41.382403047s" podCreationTimestamp="2026-02-24 00:06:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:09:30.336011062 +0000 UTC m=+234.325635561" watchObservedRunningTime="2026-02-24 00:09:30.382403047 +0000 UTC m=+234.372027526" Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.391596 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v9bck" 
event={"ID":"a7389587-c14d-45bd-b642-4ba3b5d7ac41","Type":"ContainerStarted","Data":"a614ea537267f662a11e1f4e8d0541e96b3d72d0802ce9a2d96ff8aeaaca8b4d"} Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.398499 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-q8hvw" podStartSLOduration=160.398480048 podStartE2EDuration="2m40.398480048s" podCreationTimestamp="2026-02-24 00:06:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:09:30.396422404 +0000 UTC m=+234.386046883" watchObservedRunningTime="2026-02-24 00:09:30.398480048 +0000 UTC m=+234.388104517" Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.420758 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-jm7qk" podStartSLOduration=160.420732581 podStartE2EDuration="2m40.420732581s" podCreationTimestamp="2026-02-24 00:06:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:09:30.42030612 +0000 UTC m=+234.409930579" watchObservedRunningTime="2026-02-24 00:09:30.420732581 +0000 UTC m=+234.410357050" Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.429240 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-8krrp" event={"ID":"cb0a4d10-0131-49ec-97f5-e77f1f222cdd","Type":"ContainerStarted","Data":"488579b043a6667caf4583cad8f042d6fa78b1ae65d09ae9fa6aecb15bb321b8"} Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.442972 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.443979 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ksmk7" event={"ID":"b14c3eec-796c-48b0-b4fe-67cb327f2de7","Type":"ContainerStarted","Data":"278ff91e5756c4337eeb078485e2880497cc0a4aec17577a57629a7131766cce"} Feb 24 00:09:30 crc kubenswrapper[4824]: E0224 00:09:30.445337 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:30.945321275 +0000 UTC m=+234.934945744 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.469230 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4tvd9" event={"ID":"782d3fe9-7b5b-4d44-a3e6-0efea9d617ea","Type":"ContainerStarted","Data":"6dfc0c52c93437840e6a1b1efb9329928f54cfd73337fbdd42992283f6ab0e30"} Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.485914 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v9bck" podStartSLOduration=160.485888357 podStartE2EDuration="2m40.485888357s" podCreationTimestamp="2026-02-24 
00:06:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:09:30.479386196 +0000 UTC m=+234.469010685" watchObservedRunningTime="2026-02-24 00:09:30.485888357 +0000 UTC m=+234.475512826" Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.504062 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5ltfz" event={"ID":"ac257861-33c1-4e92-9d58-bb7351f6316e","Type":"ContainerStarted","Data":"576206ef4cd543d1284208fe093beda1ae91b3259cc7628242b381f11e37105c"} Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.511749 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ghwq4" event={"ID":"54cdfa0a-fdb0-4509-9d56-01194a25ee63","Type":"ContainerStarted","Data":"40adc91feaed59d35fce2485124eb78147af78e7e249f09a95645b0bcff39f23"} Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.517613 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29531520-969xh" event={"ID":"f09bc4be-bc94-4c63-93ec-4bc2fef07d1b","Type":"ContainerStarted","Data":"6b2ac39d85326d80c4e57096bd6873f9064eac38a27f5eddf04bd260901e4edf"} Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.521839 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-zlnwh" event={"ID":"5f4f79cd-ada9-4ec7-b779-94d97bdadc97","Type":"ContainerStarted","Data":"11795379a86e024e8f5b6ad9e9cbdb8472c4d324b7b1c3b1671d4ddf8d54e1cb"} Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.523364 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-dq9gz" event={"ID":"42d75b69-be96-43de-8687-444a81d8ebd5","Type":"ContainerStarted","Data":"682e056d710c3e51c0158c29dc5d66e10c283291a81afb4e4ab4cc742bfec0fc"} Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.528557 
4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hl2rv" event={"ID":"33028c7d-6bd8-46bb-a7d9-e1c46b0a26e4","Type":"ContainerStarted","Data":"a36a25639fb2445cd75d2d2db55ddd1407eaca6928184e9027fcf8f590aa73df"} Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.530017 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xs7nb" event={"ID":"c15a2653-454b-42e4-85b5-87b99cc30198","Type":"ContainerStarted","Data":"52a7ea0db83a5faab3d8ee5297e5291b0a41c601ee08c66bc2f4dddc021db8c1"} Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.539636 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vgr74" event={"ID":"4349e271-7758-4dcf-9053-fbc984436a8b","Type":"ContainerStarted","Data":"1cb734d300d3a94c1fb6999fd12350fe2c76721ac4830bd6dc509bab9b28b256"} Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.540404 4824 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-jf5jw container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.13:6443/healthz\": dial tcp 10.217.0.13:6443: connect: connection refused" start-of-body= Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.540478 4824 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-jf5jw" podUID="6f8699c7-58f5-4a80-b5af-5403cb178676" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.13:6443/healthz\": dial tcp 10.217.0.13:6443: connect: connection refused" Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.541669 4824 patch_prober.go:28] interesting pod/downloads-7954f5f757-r4c4b container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 
10.217.0.11:8080: connect: connection refused" start-of-body= Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.542074 4824 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-r4c4b" podUID="581e69ae-c21a-4a9e-b1ea-9c38256d7b30" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.544543 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:30 crc kubenswrapper[4824]: E0224 00:09:30.544892 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:31.044867591 +0000 UTC m=+235.034492060 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.545845 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:30 crc kubenswrapper[4824]: E0224 00:09:30.548192 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:31.048164807 +0000 UTC m=+235.037789276 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.550336 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-b2k2f" podStartSLOduration=160.550303573 podStartE2EDuration="2m40.550303573s" podCreationTimestamp="2026-02-24 00:06:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:09:30.538986627 +0000 UTC m=+234.528611106" watchObservedRunningTime="2026-02-24 00:09:30.550303573 +0000 UTC m=+234.539928042" Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.552391 4824 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-9k27r container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.552466 4824 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9k27r" podUID="1c4e1d48-7f8d-44b6-97b3-3ceccb35385b" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.601906 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/marketplace-operator-79b997595-99tkw" podStartSLOduration=160.601885344 podStartE2EDuration="2m40.601885344s" podCreationTimestamp="2026-02-24 00:06:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:09:30.561749383 +0000 UTC m=+234.551373852" watchObservedRunningTime="2026-02-24 00:09:30.601885344 +0000 UTC m=+234.591509813" Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.605166 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-vvlvv" podStartSLOduration=6.605153239 podStartE2EDuration="6.605153239s" podCreationTimestamp="2026-02-24 00:09:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:09:30.601778051 +0000 UTC m=+234.591402530" watchObservedRunningTime="2026-02-24 00:09:30.605153239 +0000 UTC m=+234.594777718" Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.632011 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hl2rv" podStartSLOduration=160.631985592 podStartE2EDuration="2m40.631985592s" podCreationTimestamp="2026-02-24 00:06:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:09:30.631801357 +0000 UTC m=+234.621425846" watchObservedRunningTime="2026-02-24 00:09:30.631985592 +0000 UTC m=+234.621610061" Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.649708 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:30 crc kubenswrapper[4824]: E0224 00:09:30.649931 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:31.149885341 +0000 UTC m=+235.139509820 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.655227 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:30 crc kubenswrapper[4824]: E0224 00:09:30.655724 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:31.155708173 +0000 UTC m=+235.145332632 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.672098 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-8krrp" podStartSLOduration=160.672073942 podStartE2EDuration="2m40.672073942s" podCreationTimestamp="2026-02-24 00:06:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:09:30.654172673 +0000 UTC m=+234.643797162" watchObservedRunningTime="2026-02-24 00:09:30.672073942 +0000 UTC m=+234.661698421" Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.675713 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-pruner-29531520-969xh" podStartSLOduration=161.675699686 podStartE2EDuration="2m41.675699686s" podCreationTimestamp="2026-02-24 00:06:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:09:30.67162824 +0000 UTC m=+234.661252719" watchObservedRunningTime="2026-02-24 00:09:30.675699686 +0000 UTC m=+234.665324155" Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.756855 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:30 crc kubenswrapper[4824]: E0224 00:09:30.757060 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:31.257029396 +0000 UTC m=+235.246653865 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.757222 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:30 crc kubenswrapper[4824]: E0224 00:09:30.757818 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:31.257809926 +0000 UTC m=+235.247434395 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.857956 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:30 crc kubenswrapper[4824]: E0224 00:09:30.858246 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:31.358215185 +0000 UTC m=+235.347839664 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.858429 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:30 crc kubenswrapper[4824]: E0224 00:09:30.858821 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:31.358812661 +0000 UTC m=+235.348437130 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.901498 4824 patch_prober.go:28] interesting pod/router-default-5444994796-fp4wq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 24 00:09:30 crc kubenswrapper[4824]: [-]has-synced failed: reason withheld Feb 24 00:09:30 crc kubenswrapper[4824]: [+]process-running ok Feb 24 00:09:30 crc kubenswrapper[4824]: healthz check failed Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.902041 4824 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fp4wq" podUID="5b0ff99f-1e04-4e23-895a-a02a303c8daa" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 24 00:09:30 crc kubenswrapper[4824]: I0224 00:09:30.960508 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:30 crc kubenswrapper[4824]: E0224 00:09:30.961091 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-24 00:09:31.461050328 +0000 UTC m=+235.450674797 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:31 crc kubenswrapper[4824]: I0224 00:09:31.062585 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:31 crc kubenswrapper[4824]: E0224 00:09:31.063082 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:31.563061149 +0000 UTC m=+235.552685618 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:31 crc kubenswrapper[4824]: I0224 00:09:31.163884 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:31 crc kubenswrapper[4824]: E0224 00:09:31.164129 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:31.664087804 +0000 UTC m=+235.653712273 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:31 crc kubenswrapper[4824]: I0224 00:09:31.164183 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:31 crc kubenswrapper[4824]: E0224 00:09:31.164749 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:31.664739961 +0000 UTC m=+235.654364430 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:31 crc kubenswrapper[4824]: I0224 00:09:31.265446 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:31 crc kubenswrapper[4824]: E0224 00:09:31.265715 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:31.765677764 +0000 UTC m=+235.755302233 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:31 crc kubenswrapper[4824]: I0224 00:09:31.266168 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:31 crc kubenswrapper[4824]: E0224 00:09:31.266565 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:31.766555367 +0000 UTC m=+235.756179836 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:31 crc kubenswrapper[4824]: I0224 00:09:31.367159 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:31 crc kubenswrapper[4824]: E0224 00:09:31.367622 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:31.867557741 +0000 UTC m=+235.857182270 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:31 crc kubenswrapper[4824]: I0224 00:09:31.469631 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:31 crc kubenswrapper[4824]: E0224 00:09:31.470717 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:31.970698402 +0000 UTC m=+235.960322871 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:31 crc kubenswrapper[4824]: I0224 00:09:31.550406 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ksmk7" event={"ID":"b14c3eec-796c-48b0-b4fe-67cb327f2de7","Type":"ContainerStarted","Data":"66df04767ad299bd1444bcd83a55f2280009eb8ca6a2e21f1daa8ca3500b808d"} Feb 24 00:09:31 crc kubenswrapper[4824]: I0224 00:09:31.551079 4824 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-99tkw container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Feb 24 00:09:31 crc kubenswrapper[4824]: I0224 00:09:31.551124 4824 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-99tkw" podUID="e312a49f-dc7a-49fc-9baf-3105fec587ae" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" Feb 24 00:09:31 crc kubenswrapper[4824]: I0224 00:09:31.551325 4824 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-rvv9j container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" start-of-body= Feb 24 00:09:31 crc kubenswrapper[4824]: I0224 00:09:31.551375 4824 
prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rvv9j" podUID="a4a8a6d0-c052-4bca-9be8-dddd6d2ef017" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" Feb 24 00:09:31 crc kubenswrapper[4824]: I0224 00:09:31.551325 4824 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-jm7qk container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Feb 24 00:09:31 crc kubenswrapper[4824]: I0224 00:09:31.551436 4824 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-jm7qk" podUID="49f90ffa-baa8-4d9b-bd17-3a12f9c6f115" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Feb 24 00:09:31 crc kubenswrapper[4824]: I0224 00:09:31.551617 4824 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-jf5jw container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.13:6443/healthz\": dial tcp 10.217.0.13:6443: connect: connection refused" start-of-body= Feb 24 00:09:31 crc kubenswrapper[4824]: I0224 00:09:31.551646 4824 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-jf5jw" podUID="6f8699c7-58f5-4a80-b5af-5403cb178676" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.13:6443/healthz\": dial tcp 10.217.0.13:6443: connect: connection refused" Feb 24 00:09:31 crc kubenswrapper[4824]: I0224 00:09:31.552049 4824 patch_prober.go:28] interesting pod/console-operator-58897d9998-p5tqf container/console-operator 
namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/readyz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Feb 24 00:09:31 crc kubenswrapper[4824]: I0224 00:09:31.552159 4824 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-p5tqf" podUID="2f39cab4-77fc-4641-9e84-c01b0dedc300" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/readyz\": dial tcp 10.217.0.24:8443: connect: connection refused" Feb 24 00:09:31 crc kubenswrapper[4824]: I0224 00:09:31.571887 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:31 crc kubenswrapper[4824]: E0224 00:09:31.572364 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:32.072343353 +0000 UTC m=+236.061967832 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:31 crc kubenswrapper[4824]: I0224 00:09:31.578467 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d95c9" podStartSLOduration=161.578451573 podStartE2EDuration="2m41.578451573s" podCreationTimestamp="2026-02-24 00:06:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:09:31.575073665 +0000 UTC m=+235.564698134" watchObservedRunningTime="2026-02-24 00:09:31.578451573 +0000 UTC m=+235.568076072" Feb 24 00:09:31 crc kubenswrapper[4824]: I0224 00:09:31.678647 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:31 crc kubenswrapper[4824]: E0224 00:09:31.679115 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:32.179090618 +0000 UTC m=+236.168715127 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:31 crc kubenswrapper[4824]: I0224 00:09:31.782295 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:31 crc kubenswrapper[4824]: E0224 00:09:31.782987 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:32.282944757 +0000 UTC m=+236.272569226 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:31 crc kubenswrapper[4824]: I0224 00:09:31.884034 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:31 crc kubenswrapper[4824]: E0224 00:09:31.884466 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:32.384450625 +0000 UTC m=+236.374075094 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:31 crc kubenswrapper[4824]: I0224 00:09:31.908486 4824 patch_prober.go:28] interesting pod/router-default-5444994796-fp4wq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 24 00:09:31 crc kubenswrapper[4824]: [-]has-synced failed: reason withheld Feb 24 00:09:31 crc kubenswrapper[4824]: [+]process-running ok Feb 24 00:09:31 crc kubenswrapper[4824]: healthz check failed Feb 24 00:09:31 crc kubenswrapper[4824]: I0224 00:09:31.908591 4824 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fp4wq" podUID="5b0ff99f-1e04-4e23-895a-a02a303c8daa" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 24 00:09:31 crc kubenswrapper[4824]: I0224 00:09:31.985430 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:31 crc kubenswrapper[4824]: E0224 00:09:31.985707 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-24 00:09:32.485665125 +0000 UTC m=+236.475289594 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:31 crc kubenswrapper[4824]: I0224 00:09:31.985791 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:31 crc kubenswrapper[4824]: E0224 00:09:31.986215 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:32.486197369 +0000 UTC m=+236.475821848 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:32 crc kubenswrapper[4824]: I0224 00:09:32.087225 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:32 crc kubenswrapper[4824]: E0224 00:09:32.087385 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:32.587354898 +0000 UTC m=+236.576979377 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:32 crc kubenswrapper[4824]: I0224 00:09:32.088005 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:32 crc kubenswrapper[4824]: E0224 00:09:32.088444 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:32.588436636 +0000 UTC m=+236.578061105 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:32 crc kubenswrapper[4824]: I0224 00:09:32.189626 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:32 crc kubenswrapper[4824]: E0224 00:09:32.189829 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:32.689802479 +0000 UTC m=+236.679426958 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:32 crc kubenswrapper[4824]: I0224 00:09:32.190021 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:32 crc kubenswrapper[4824]: E0224 00:09:32.190376 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:32.690365084 +0000 UTC m=+236.679989553 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:32 crc kubenswrapper[4824]: I0224 00:09:32.292645 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:32 crc kubenswrapper[4824]: E0224 00:09:32.292811 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:32.792782925 +0000 UTC m=+236.782407394 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:32 crc kubenswrapper[4824]: I0224 00:09:32.292971 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:32 crc kubenswrapper[4824]: E0224 00:09:32.293379 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:32.793367801 +0000 UTC m=+236.782992270 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:32 crc kubenswrapper[4824]: I0224 00:09:32.394621 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:32 crc kubenswrapper[4824]: E0224 00:09:32.395254 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:32.895237168 +0000 UTC m=+236.884861627 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:32 crc kubenswrapper[4824]: I0224 00:09:32.496828 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:32 crc kubenswrapper[4824]: E0224 00:09:32.497402 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:32.997373932 +0000 UTC m=+236.986998581 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:32 crc kubenswrapper[4824]: I0224 00:09:32.565940 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jkghx" event={"ID":"836fad19-b7d1-434c-9fd8-faf3eb1d80d1","Type":"ContainerStarted","Data":"1b6eef4295149ffe56c232ba193db1b543ea313748e795c0c178e010dfcfcb6c"} Feb 24 00:09:32 crc kubenswrapper[4824]: I0224 00:09:32.568578 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-h7djl" event={"ID":"8118fe3c-1479-4634-9b64-9350991d909d","Type":"ContainerStarted","Data":"e5992154328924a0401a820d6dad00bd43a7e667fb2550eca5fc7b3d69e96618"} Feb 24 00:09:32 crc kubenswrapper[4824]: I0224 00:09:32.570635 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5ltfz" event={"ID":"ac257861-33c1-4e92-9d58-bb7351f6316e","Type":"ContainerStarted","Data":"ecde68be0b336e882d427864e1dfa4386122dc37a0c4f9beae52cff5d43902a6"} Feb 24 00:09:32 crc kubenswrapper[4824]: I0224 00:09:32.572016 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-w6pjq" event={"ID":"4cca2c2a-43ba-4b84-b10b-25053c6d7350","Type":"ContainerStarted","Data":"2ae757ea979a7ac019bb332bc89653a5714f9b00d5796260819aae7e0428b02f"} Feb 24 00:09:32 crc kubenswrapper[4824]: I0224 00:09:32.573454 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5n768" 
event={"ID":"511cd2d5-0160-44f2-adf1-acbe5c8c28cf","Type":"ContainerStarted","Data":"ec72dee84d18c234baaccba6cf3679b1174eff33b9ced0b7b28f253c0c80a43e"} Feb 24 00:09:32 crc kubenswrapper[4824]: I0224 00:09:32.574828 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-zlnwh" event={"ID":"5f4f79cd-ada9-4ec7-b779-94d97bdadc97","Type":"ContainerStarted","Data":"3dbc8a0425a8303cb08f42b8f408754fe6fb9b00ab2e049ad49dd4c9ef81e736"} Feb 24 00:09:32 crc kubenswrapper[4824]: I0224 00:09:32.576571 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-9ml5g" event={"ID":"01ed973e-7ed7-41ec-bea9-69d8c86e19ed","Type":"ContainerStarted","Data":"76597979f779da55dc760372f9bfeeec71c0717a041440cee55acfa0e56598b2"} Feb 24 00:09:32 crc kubenswrapper[4824]: I0224 00:09:32.578893 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c94hb" event={"ID":"c3e3160a-60c7-424c-b5a3-53841213467d","Type":"ContainerStarted","Data":"29ae1f2b3deee60e623dca5b325a17e8dcc7f86420b3a5605bec9d7d65816185"} Feb 24 00:09:32 crc kubenswrapper[4824]: I0224 00:09:32.580471 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-6h296" event={"ID":"67b600b4-d056-4b5f-b75e-0502de432461","Type":"ContainerStarted","Data":"c75b9acd0e6219277e1ed8cdeac6bc1c61b521997b9fb4d347a6fda7ba8b1f52"} Feb 24 00:09:32 crc kubenswrapper[4824]: I0224 00:09:32.582728 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-cxlfh" event={"ID":"ac86b042-947d-402f-a7c8-bb0a69d3f86e","Type":"ContainerStarted","Data":"ef7da6be8daeff3b4ee3adeb6ad4fdc1236a2d3702c75102fc84a3334d09e95d"} Feb 24 00:09:32 crc kubenswrapper[4824]: I0224 00:09:32.585285 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/machine-api-operator-5694c8668f-kh6hg" event={"ID":"53344821-2f26-459a-9e42-003f3f1b5a87","Type":"ContainerStarted","Data":"bb408f9716b85b0696c904977b4c9810441237ecfb99dda99024bcc78be24752"} Feb 24 00:09:32 crc kubenswrapper[4824]: I0224 00:09:32.588009 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2jqvq" event={"ID":"a2e0a401-0fd7-499c-ac31-fc8cb0a64366","Type":"ContainerStarted","Data":"1fe87021265dbd7c6409775966557700181ff3808a6e5e56b264250598564bcc"} Feb 24 00:09:32 crc kubenswrapper[4824]: I0224 00:09:32.590866 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xfl22" event={"ID":"f66ddecd-538b-48bd-a335-e7f99181daa0","Type":"ContainerStarted","Data":"ee640a5eba3acb5836b6e55a2210f8f9e73aeeb6b9711c79c0bdef1c3f9fb7b6"} Feb 24 00:09:32 crc kubenswrapper[4824]: I0224 00:09:32.594755 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vgr74" event={"ID":"4349e271-7758-4dcf-9053-fbc984436a8b","Type":"ContainerStarted","Data":"717c6aab294c093e72a42685730d12eb9b72c40bb4cd1218cac66c674982e359"} Feb 24 00:09:32 crc kubenswrapper[4824]: I0224 00:09:32.597770 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:32 crc kubenswrapper[4824]: E0224 00:09:32.598123 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-24 00:09:33.098038218 +0000 UTC m=+237.087662777 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:32 crc kubenswrapper[4824]: I0224 00:09:32.598785 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:32 crc kubenswrapper[4824]: I0224 00:09:32.598899 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tjdhm" event={"ID":"44376c4d-d433-41a2-bdc1-22a9792e7640","Type":"ContainerStarted","Data":"05abb37f3bcb860ede06a3342411c0de6204f9dfcbc0bd6880e1f021469ed20a"} Feb 24 00:09:32 crc kubenswrapper[4824]: E0224 00:09:32.599236 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:33.099216829 +0000 UTC m=+237.088841298 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:32 crc kubenswrapper[4824]: I0224 00:09:32.600895 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xs7nb" event={"ID":"c15a2653-454b-42e4-85b5-87b99cc30198","Type":"ContainerStarted","Data":"6fd95aec2574f28f3bc3f804569fbe48cd52370c2ad0f28f3a01b83682176761"} Feb 24 00:09:32 crc kubenswrapper[4824]: I0224 00:09:32.602356 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4tvd9" event={"ID":"782d3fe9-7b5b-4d44-a3e6-0efea9d617ea","Type":"ContainerStarted","Data":"cc79de462b5a5b9e6de23c9332363aaa2ead5c5d34eddcfbb2be9e7e0cef65ce"} Feb 24 00:09:32 crc kubenswrapper[4824]: I0224 00:09:32.605654 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ghwq4" event={"ID":"54cdfa0a-fdb0-4509-9d56-01194a25ee63","Type":"ContainerStarted","Data":"4ca6cf6c4ebab436114b68df61d02fb23d08e62e532d3c479d8cbfefd9da8122"} Feb 24 00:09:32 crc kubenswrapper[4824]: I0224 00:09:32.606384 4824 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-jm7qk container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Feb 24 00:09:32 crc kubenswrapper[4824]: I0224 00:09:32.606417 4824 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-controller-manager/controller-manager-879f6c89f-jm7qk" podUID="49f90ffa-baa8-4d9b-bd17-3a12f9c6f115" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Feb 24 00:09:32 crc kubenswrapper[4824]: I0224 00:09:32.628888 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ksmk7" podStartSLOduration=162.628864585 podStartE2EDuration="2m42.628864585s" podCreationTimestamp="2026-02-24 00:06:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:09:32.622858578 +0000 UTC m=+236.612483077" watchObservedRunningTime="2026-02-24 00:09:32.628864585 +0000 UTC m=+236.618489064" Feb 24 00:09:32 crc kubenswrapper[4824]: I0224 00:09:32.704455 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:32 crc kubenswrapper[4824]: E0224 00:09:32.706182 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:33.206158929 +0000 UTC m=+237.195783398 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:32 crc kubenswrapper[4824]: I0224 00:09:32.806737 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:32 crc kubenswrapper[4824]: E0224 00:09:32.807289 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:33.307266956 +0000 UTC m=+237.296891425 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:32 crc kubenswrapper[4824]: I0224 00:09:32.903233 4824 patch_prober.go:28] interesting pod/router-default-5444994796-fp4wq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 24 00:09:32 crc kubenswrapper[4824]: [-]has-synced failed: reason withheld Feb 24 00:09:32 crc kubenswrapper[4824]: [+]process-running ok Feb 24 00:09:32 crc kubenswrapper[4824]: healthz check failed Feb 24 00:09:32 crc kubenswrapper[4824]: I0224 00:09:32.903309 4824 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fp4wq" podUID="5b0ff99f-1e04-4e23-895a-a02a303c8daa" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 24 00:09:32 crc kubenswrapper[4824]: I0224 00:09:32.908264 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:32 crc kubenswrapper[4824]: E0224 00:09:32.908693 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-24 00:09:33.408678891 +0000 UTC m=+237.398303360 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:33 crc kubenswrapper[4824]: I0224 00:09:33.010612 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:33 crc kubenswrapper[4824]: E0224 00:09:33.011161 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:33.511132274 +0000 UTC m=+237.500756753 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:33 crc kubenswrapper[4824]: I0224 00:09:33.112675 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:33 crc kubenswrapper[4824]: E0224 00:09:33.112881 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:33.612852167 +0000 UTC m=+237.602476646 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:33 crc kubenswrapper[4824]: I0224 00:09:33.113456 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:33 crc kubenswrapper[4824]: E0224 00:09:33.113972 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:33.613950736 +0000 UTC m=+237.603575195 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:33 crc kubenswrapper[4824]: I0224 00:09:33.215539 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:33 crc kubenswrapper[4824]: E0224 00:09:33.215826 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:33.715787152 +0000 UTC m=+237.705411661 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:33 crc kubenswrapper[4824]: I0224 00:09:33.216161 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:33 crc kubenswrapper[4824]: E0224 00:09:33.216790 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:33.716766228 +0000 UTC m=+237.706390727 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:33 crc kubenswrapper[4824]: I0224 00:09:33.317108 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:33 crc kubenswrapper[4824]: E0224 00:09:33.317628 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:33.817567907 +0000 UTC m=+237.807192426 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:33 crc kubenswrapper[4824]: I0224 00:09:33.419571 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:33 crc kubenswrapper[4824]: E0224 00:09:33.420088 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:33.9200417 +0000 UTC m=+237.909666169 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:33 crc kubenswrapper[4824]: I0224 00:09:33.525594 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:33 crc kubenswrapper[4824]: E0224 00:09:33.526346 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:34.026305383 +0000 UTC m=+238.015929852 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:33 crc kubenswrapper[4824]: I0224 00:09:33.527265 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:33 crc kubenswrapper[4824]: E0224 00:09:33.527693 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:34.027681319 +0000 UTC m=+238.017305788 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:33 crc kubenswrapper[4824]: I0224 00:09:33.628492 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:33 crc kubenswrapper[4824]: E0224 00:09:33.628865 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:34.128819007 +0000 UTC m=+238.118443516 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:33 crc kubenswrapper[4824]: I0224 00:09:33.629287 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:33 crc kubenswrapper[4824]: E0224 00:09:33.629647 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:34.129631968 +0000 UTC m=+238.119256437 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:33 crc kubenswrapper[4824]: I0224 00:09:33.739718 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:33 crc kubenswrapper[4824]: E0224 00:09:33.739931 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:34.239888565 +0000 UTC m=+238.229513034 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:33 crc kubenswrapper[4824]: I0224 00:09:33.740217 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:33 crc kubenswrapper[4824]: E0224 00:09:33.740692 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:34.240683936 +0000 UTC m=+238.230308405 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:33 crc kubenswrapper[4824]: I0224 00:09:33.842120 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:33 crc kubenswrapper[4824]: E0224 00:09:33.842386 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:34.342334427 +0000 UTC m=+238.331958906 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:33 crc kubenswrapper[4824]: I0224 00:09:33.842903 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:33 crc kubenswrapper[4824]: E0224 00:09:33.843302 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:34.343285302 +0000 UTC m=+238.332909771 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:33 crc kubenswrapper[4824]: I0224 00:09:33.901078 4824 patch_prober.go:28] interesting pod/router-default-5444994796-fp4wq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 24 00:09:33 crc kubenswrapper[4824]: [-]has-synced failed: reason withheld Feb 24 00:09:33 crc kubenswrapper[4824]: [+]process-running ok Feb 24 00:09:33 crc kubenswrapper[4824]: healthz check failed Feb 24 00:09:33 crc kubenswrapper[4824]: I0224 00:09:33.901215 4824 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fp4wq" podUID="5b0ff99f-1e04-4e23-895a-a02a303c8daa" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 24 00:09:33 crc kubenswrapper[4824]: I0224 00:09:33.943834 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:33 crc kubenswrapper[4824]: E0224 00:09:33.944163 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-24 00:09:34.444099492 +0000 UTC m=+238.433723961 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:33 crc kubenswrapper[4824]: I0224 00:09:33.944354 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:33 crc kubenswrapper[4824]: E0224 00:09:33.944867 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:34.444845391 +0000 UTC m=+238.434469880 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:34 crc kubenswrapper[4824]: I0224 00:09:34.046061 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:34 crc kubenswrapper[4824]: E0224 00:09:34.046311 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:34.546265347 +0000 UTC m=+238.535889826 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:34 crc kubenswrapper[4824]: I0224 00:09:34.046694 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:34 crc kubenswrapper[4824]: E0224 00:09:34.047082 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:34.547067028 +0000 UTC m=+238.536691497 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:34 crc kubenswrapper[4824]: I0224 00:09:34.148308 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:34 crc kubenswrapper[4824]: E0224 00:09:34.148665 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:34.648620947 +0000 UTC m=+238.638245426 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:34 crc kubenswrapper[4824]: I0224 00:09:34.148998 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:34 crc kubenswrapper[4824]: E0224 00:09:34.149565 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:34.649553291 +0000 UTC m=+238.639177810 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:34 crc kubenswrapper[4824]: I0224 00:09:34.249904 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:34 crc kubenswrapper[4824]: E0224 00:09:34.250373 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:34.75035591 +0000 UTC m=+238.739980379 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:34 crc kubenswrapper[4824]: I0224 00:09:34.391206 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:34 crc kubenswrapper[4824]: E0224 00:09:34.392066 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:34.89204874 +0000 UTC m=+238.881673209 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:34 crc kubenswrapper[4824]: I0224 00:09:34.493640 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:34 crc kubenswrapper[4824]: E0224 00:09:34.495875 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:34.995848588 +0000 UTC m=+238.985473057 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:34 crc kubenswrapper[4824]: I0224 00:09:34.595627 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:34 crc kubenswrapper[4824]: E0224 00:09:34.596136 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:35.096119283 +0000 UTC m=+239.085743752 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:34 crc kubenswrapper[4824]: I0224 00:09:34.650287 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-cxlfh" podStartSLOduration=164.650258531 podStartE2EDuration="2m44.650258531s" podCreationTimestamp="2026-02-24 00:06:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:09:34.648045113 +0000 UTC m=+238.637669582" watchObservedRunningTime="2026-02-24 00:09:34.650258531 +0000 UTC m=+238.639883010" Feb 24 00:09:34 crc kubenswrapper[4824]: I0224 00:09:34.668661 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2jqvq" podStartSLOduration=164.668636062 podStartE2EDuration="2m44.668636062s" podCreationTimestamp="2026-02-24 00:06:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:09:34.663357914 +0000 UTC m=+238.652982383" watchObservedRunningTime="2026-02-24 00:09:34.668636062 +0000 UTC m=+238.658260531" Feb 24 00:09:34 crc kubenswrapper[4824]: I0224 00:09:34.696984 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:34 crc kubenswrapper[4824]: E0224 00:09:34.697394 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:35.197372934 +0000 UTC m=+239.186997403 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:34 crc kubenswrapper[4824]: I0224 00:09:34.706164 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-w6pjq" podStartSLOduration=164.706150334 podStartE2EDuration="2m44.706150334s" podCreationTimestamp="2026-02-24 00:06:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:09:34.70561713 +0000 UTC m=+238.695241609" watchObservedRunningTime="2026-02-24 00:09:34.706150334 +0000 UTC m=+238.695774803" Feb 24 00:09:34 crc kubenswrapper[4824]: I0224 00:09:34.725077 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-c94hb" podStartSLOduration=164.725053699 podStartE2EDuration="2m44.725053699s" podCreationTimestamp="2026-02-24 00:06:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-24 00:09:34.724606618 +0000 UTC m=+238.714231087" watchObservedRunningTime="2026-02-24 00:09:34.725053699 +0000 UTC m=+238.714678168" Feb 24 00:09:34 crc kubenswrapper[4824]: I0224 00:09:34.740043 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-6h296" podStartSLOduration=164.740018681 podStartE2EDuration="2m44.740018681s" podCreationTimestamp="2026-02-24 00:06:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:09:34.739351214 +0000 UTC m=+238.728975683" watchObservedRunningTime="2026-02-24 00:09:34.740018681 +0000 UTC m=+238.729643150" Feb 24 00:09:34 crc kubenswrapper[4824]: I0224 00:09:34.797170 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jkghx" podStartSLOduration=164.797145867 podStartE2EDuration="2m44.797145867s" podCreationTimestamp="2026-02-24 00:06:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:09:34.796585512 +0000 UTC m=+238.786210001" watchObservedRunningTime="2026-02-24 00:09:34.797145867 +0000 UTC m=+238.786770336" Feb 24 00:09:34 crc kubenswrapper[4824]: I0224 00:09:34.799034 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:34 crc kubenswrapper[4824]: E0224 00:09:34.804010 4824 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:35.303983346 +0000 UTC m=+239.293608055 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:34 crc kubenswrapper[4824]: I0224 00:09:34.828842 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-kh6hg" podStartSLOduration=164.828820306 podStartE2EDuration="2m44.828820306s" podCreationTimestamp="2026-02-24 00:06:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:09:34.828356944 +0000 UTC m=+238.817981413" watchObservedRunningTime="2026-02-24 00:09:34.828820306 +0000 UTC m=+238.818444775" Feb 24 00:09:34 crc kubenswrapper[4824]: I0224 00:09:34.864980 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-9ml5g" podStartSLOduration=164.864954102 podStartE2EDuration="2m44.864954102s" podCreationTimestamp="2026-02-24 00:06:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:09:34.864659635 +0000 UTC m=+238.854284194" watchObservedRunningTime="2026-02-24 00:09:34.864954102 +0000 UTC m=+238.854578571" Feb 24 00:09:34 crc kubenswrapper[4824]: I0224 00:09:34.883220 4824 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tjdhm" podStartSLOduration=164.88318385 podStartE2EDuration="2m44.88318385s" podCreationTimestamp="2026-02-24 00:06:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:09:34.882748788 +0000 UTC m=+238.872373267" watchObservedRunningTime="2026-02-24 00:09:34.88318385 +0000 UTC m=+238.872808329" Feb 24 00:09:34 crc kubenswrapper[4824]: I0224 00:09:34.902615 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:34 crc kubenswrapper[4824]: E0224 00:09:34.903119 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:35.403097991 +0000 UTC m=+239.392722460 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:34 crc kubenswrapper[4824]: I0224 00:09:34.909814 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vgr74" podStartSLOduration=165.909790896 podStartE2EDuration="2m45.909790896s" podCreationTimestamp="2026-02-24 00:06:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:09:34.907901247 +0000 UTC m=+238.897525716" watchObservedRunningTime="2026-02-24 00:09:34.909790896 +0000 UTC m=+238.899415365" Feb 24 00:09:34 crc kubenswrapper[4824]: I0224 00:09:34.916218 4824 patch_prober.go:28] interesting pod/router-default-5444994796-fp4wq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 24 00:09:34 crc kubenswrapper[4824]: [-]has-synced failed: reason withheld Feb 24 00:09:34 crc kubenswrapper[4824]: [+]process-running ok Feb 24 00:09:34 crc kubenswrapper[4824]: healthz check failed Feb 24 00:09:34 crc kubenswrapper[4824]: I0224 00:09:34.916301 4824 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fp4wq" podUID="5b0ff99f-1e04-4e23-895a-a02a303c8daa" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 24 00:09:34 crc kubenswrapper[4824]: I0224 00:09:34.937948 4824 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-console/console-f9d7485db-zlnwh" podStartSLOduration=164.937928373 podStartE2EDuration="2m44.937928373s" podCreationTimestamp="2026-02-24 00:06:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:09:34.936474465 +0000 UTC m=+238.926098934" watchObservedRunningTime="2026-02-24 00:09:34.937928373 +0000 UTC m=+238.927552842" Feb 24 00:09:34 crc kubenswrapper[4824]: I0224 00:09:34.973718 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-4tvd9" podStartSLOduration=10.973696329 podStartE2EDuration="10.973696329s" podCreationTimestamp="2026-02-24 00:09:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:09:34.972398135 +0000 UTC m=+238.962022604" watchObservedRunningTime="2026-02-24 00:09:34.973696329 +0000 UTC m=+238.963320798" Feb 24 00:09:34 crc kubenswrapper[4824]: I0224 00:09:34.997507 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-h7djl" podStartSLOduration=164.997480542 podStartE2EDuration="2m44.997480542s" podCreationTimestamp="2026-02-24 00:06:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:09:34.995368687 +0000 UTC m=+238.984993166" watchObservedRunningTime="2026-02-24 00:09:34.997480542 +0000 UTC m=+238.987105021" Feb 24 00:09:35 crc kubenswrapper[4824]: I0224 00:09:35.010428 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: 
\"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:35 crc kubenswrapper[4824]: E0224 00:09:35.010924 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:35.510899894 +0000 UTC m=+239.500524433 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:35 crc kubenswrapper[4824]: I0224 00:09:35.035839 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xs7nb" podStartSLOduration=165.035809196 podStartE2EDuration="2m45.035809196s" podCreationTimestamp="2026-02-24 00:06:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:09:35.031718109 +0000 UTC m=+239.021342598" watchObservedRunningTime="2026-02-24 00:09:35.035809196 +0000 UTC m=+239.025433665" Feb 24 00:09:35 crc kubenswrapper[4824]: I0224 00:09:35.111867 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:35 crc kubenswrapper[4824]: E0224 00:09:35.112074 4824 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:35.612036862 +0000 UTC m=+239.601661331 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:35 crc kubenswrapper[4824]: I0224 00:09:35.112134 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:35 crc kubenswrapper[4824]: E0224 00:09:35.112538 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:35.612528695 +0000 UTC m=+239.602153164 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:35 crc kubenswrapper[4824]: I0224 00:09:35.214017 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:35 crc kubenswrapper[4824]: E0224 00:09:35.214218 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:35.714172586 +0000 UTC m=+239.703797065 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:35 crc kubenswrapper[4824]: I0224 00:09:35.214425 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:35 crc kubenswrapper[4824]: E0224 00:09:35.215011 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:35.714985257 +0000 UTC m=+239.704609906 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:35 crc kubenswrapper[4824]: I0224 00:09:35.236012 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 24 00:09:35 crc kubenswrapper[4824]: I0224 00:09:35.236906 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 24 00:09:35 crc kubenswrapper[4824]: I0224 00:09:35.242984 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 24 00:09:35 crc kubenswrapper[4824]: I0224 00:09:35.243562 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Feb 24 00:09:35 crc kubenswrapper[4824]: I0224 00:09:35.261466 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 24 00:09:35 crc kubenswrapper[4824]: I0224 00:09:35.315554 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:35 crc kubenswrapper[4824]: E0224 00:09:35.315809 4824 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:35.815762426 +0000 UTC m=+239.805386905 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:35 crc kubenswrapper[4824]: I0224 00:09:35.315875 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:35 crc kubenswrapper[4824]: E0224 00:09:35.316565 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:35.816546797 +0000 UTC m=+239.806171266 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:35 crc kubenswrapper[4824]: I0224 00:09:35.371686 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-9ml5g" Feb 24 00:09:35 crc kubenswrapper[4824]: I0224 00:09:35.373538 4824 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-9ml5g container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Feb 24 00:09:35 crc kubenswrapper[4824]: I0224 00:09:35.373614 4824 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-9ml5g" podUID="01ed973e-7ed7-41ec-bea9-69d8c86e19ed" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Feb 24 00:09:35 crc kubenswrapper[4824]: I0224 00:09:35.373665 4824 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-9ml5g container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Feb 24 00:09:35 crc kubenswrapper[4824]: I0224 00:09:35.373748 4824 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-9ml5g" podUID="01ed973e-7ed7-41ec-bea9-69d8c86e19ed" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Feb 24 00:09:35 crc kubenswrapper[4824]: I0224 00:09:35.417439 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:35 crc kubenswrapper[4824]: I0224 00:09:35.417691 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/671ced26-8fac-4a17-a516-ab23ebcd6945-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"671ced26-8fac-4a17-a516-ab23ebcd6945\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 24 00:09:35 crc kubenswrapper[4824]: I0224 00:09:35.417724 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/671ced26-8fac-4a17-a516-ab23ebcd6945-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"671ced26-8fac-4a17-a516-ab23ebcd6945\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 24 00:09:35 crc kubenswrapper[4824]: E0224 00:09:35.417878 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:35.917861579 +0000 UTC m=+239.907486048 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:35 crc kubenswrapper[4824]: I0224 00:09:35.519548 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/671ced26-8fac-4a17-a516-ab23ebcd6945-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"671ced26-8fac-4a17-a516-ab23ebcd6945\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 24 00:09:35 crc kubenswrapper[4824]: I0224 00:09:35.519640 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/671ced26-8fac-4a17-a516-ab23ebcd6945-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"671ced26-8fac-4a17-a516-ab23ebcd6945\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 24 00:09:35 crc kubenswrapper[4824]: I0224 00:09:35.519700 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/671ced26-8fac-4a17-a516-ab23ebcd6945-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"671ced26-8fac-4a17-a516-ab23ebcd6945\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 24 00:09:35 crc kubenswrapper[4824]: I0224 00:09:35.519756 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: 
\"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:35 crc kubenswrapper[4824]: E0224 00:09:35.520462 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:36.020448975 +0000 UTC m=+240.010073444 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:35 crc kubenswrapper[4824]: I0224 00:09:35.643824 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:35 crc kubenswrapper[4824]: E0224 00:09:35.644306 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:36.144286268 +0000 UTC m=+240.133910737 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:35 crc kubenswrapper[4824]: I0224 00:09:35.654410 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5ltfz" event={"ID":"ac257861-33c1-4e92-9d58-bb7351f6316e","Type":"ContainerStarted","Data":"0ed20bbbc63392a3eeae8c7d2fd31b71d7f1c1c0e7cf28ce11803b7eefc35248"} Feb 24 00:09:35 crc kubenswrapper[4824]: I0224 00:09:35.664381 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/671ced26-8fac-4a17-a516-ab23ebcd6945-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"671ced26-8fac-4a17-a516-ab23ebcd6945\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 24 00:09:35 crc kubenswrapper[4824]: I0224 00:09:35.666880 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ghwq4" event={"ID":"54cdfa0a-fdb0-4509-9d56-01194a25ee63","Type":"ContainerStarted","Data":"490ef12a95d77d56bcb6473530e8535785604863f20cce732613ee074d0bbee8"} Feb 24 00:09:35 crc kubenswrapper[4824]: I0224 00:09:35.669666 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5n768" event={"ID":"511cd2d5-0160-44f2-adf1-acbe5c8c28cf","Type":"ContainerStarted","Data":"e980e0e1e456ea4385f34dec4d8a53291d7e56983ca3dd4e91532d20097f7188"} Feb 24 00:09:35 crc kubenswrapper[4824]: I0224 00:09:35.670198 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-5n768" 
Feb 24 00:09:35 crc kubenswrapper[4824]: I0224 00:09:35.673820 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xfl22" event={"ID":"f66ddecd-538b-48bd-a335-e7f99181daa0","Type":"ContainerStarted","Data":"cdb56e67daad25b19cf0044f1b485f89750f7982e1f6b323aa8a0e2cc9471aaa"} Feb 24 00:09:35 crc kubenswrapper[4824]: I0224 00:09:35.673916 4824 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-9ml5g container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Feb 24 00:09:35 crc kubenswrapper[4824]: I0224 00:09:35.673955 4824 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-9ml5g" podUID="01ed973e-7ed7-41ec-bea9-69d8c86e19ed" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Feb 24 00:09:35 crc kubenswrapper[4824]: I0224 00:09:35.697125 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5ltfz" podStartSLOduration=165.69709694 podStartE2EDuration="2m45.69709694s" podCreationTimestamp="2026-02-24 00:06:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:09:35.689709787 +0000 UTC m=+239.679334276" watchObservedRunningTime="2026-02-24 00:09:35.69709694 +0000 UTC m=+239.686721409" Feb 24 00:09:35 crc kubenswrapper[4824]: I0224 00:09:35.745774 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") 
pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:35 crc kubenswrapper[4824]: E0224 00:09:35.746362 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:36.246341229 +0000 UTC m=+240.235965698 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:35 crc kubenswrapper[4824]: I0224 00:09:35.756648 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-xfl22" podStartSLOduration=166.756623498 podStartE2EDuration="2m46.756623498s" podCreationTimestamp="2026-02-24 00:06:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:09:35.731195562 +0000 UTC m=+239.720820041" watchObservedRunningTime="2026-02-24 00:09:35.756623498 +0000 UTC m=+239.746247967" Feb 24 00:09:35 crc kubenswrapper[4824]: I0224 00:09:35.757324 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-5n768" podStartSLOduration=11.757316936 podStartE2EDuration="11.757316936s" podCreationTimestamp="2026-02-24 00:09:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 
00:09:35.752939892 +0000 UTC m=+239.742564371" watchObservedRunningTime="2026-02-24 00:09:35.757316936 +0000 UTC m=+239.746941405" Feb 24 00:09:35 crc kubenswrapper[4824]: I0224 00:09:35.847674 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:35 crc kubenswrapper[4824]: E0224 00:09:35.849415 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:36.349386447 +0000 UTC m=+240.339010916 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:35 crc kubenswrapper[4824]: I0224 00:09:35.855175 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 24 00:09:35 crc kubenswrapper[4824]: I0224 00:09:35.917814 4824 patch_prober.go:28] interesting pod/router-default-5444994796-fp4wq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 24 00:09:35 crc kubenswrapper[4824]: [-]has-synced failed: reason withheld Feb 24 00:09:35 crc kubenswrapper[4824]: [+]process-running ok Feb 24 00:09:35 crc kubenswrapper[4824]: healthz check failed Feb 24 00:09:35 crc kubenswrapper[4824]: I0224 00:09:35.917867 4824 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fp4wq" podUID="5b0ff99f-1e04-4e23-895a-a02a303c8daa" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 24 00:09:35 crc kubenswrapper[4824]: I0224 00:09:35.949478 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:35 crc kubenswrapper[4824]: E0224 00:09:35.950111 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:36.450080933 +0000 UTC m=+240.439705622 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:36 crc kubenswrapper[4824]: I0224 00:09:36.050775 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:36 crc kubenswrapper[4824]: E0224 00:09:36.051033 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:36.550989955 +0000 UTC m=+240.540614424 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:36 crc kubenswrapper[4824]: I0224 00:09:36.051376 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:36 crc kubenswrapper[4824]: E0224 00:09:36.051807 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:36.551790756 +0000 UTC m=+240.541415405 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:36 crc kubenswrapper[4824]: I0224 00:09:36.153362 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:36 crc kubenswrapper[4824]: E0224 00:09:36.153606 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:36.653573161 +0000 UTC m=+240.643197630 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:36 crc kubenswrapper[4824]: I0224 00:09:36.153943 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:36 crc kubenswrapper[4824]: E0224 00:09:36.154290 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:36.65427552 +0000 UTC m=+240.643899989 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:36 crc kubenswrapper[4824]: I0224 00:09:36.255557 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:36 crc kubenswrapper[4824]: E0224 00:09:36.255971 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:36.755931641 +0000 UTC m=+240.745556110 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:36 crc kubenswrapper[4824]: I0224 00:09:36.256392 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:36 crc kubenswrapper[4824]: E0224 00:09:36.256839 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:36.756829165 +0000 UTC m=+240.746453624 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:36 crc kubenswrapper[4824]: I0224 00:09:36.358354 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:36 crc kubenswrapper[4824]: E0224 00:09:36.358690 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:36.858670932 +0000 UTC m=+240.848295391 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:36 crc kubenswrapper[4824]: I0224 00:09:36.370892 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9k27r" Feb 24 00:09:36 crc kubenswrapper[4824]: I0224 00:09:36.386338 4824 patch_prober.go:28] interesting pod/downloads-7954f5f757-r4c4b container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Feb 24 00:09:36 crc kubenswrapper[4824]: I0224 00:09:36.386409 4824 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-r4c4b" podUID="581e69ae-c21a-4a9e-b1ea-9c38256d7b30" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Feb 24 00:09:36 crc kubenswrapper[4824]: I0224 00:09:36.386449 4824 patch_prober.go:28] interesting pod/downloads-7954f5f757-r4c4b container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Feb 24 00:09:36 crc kubenswrapper[4824]: I0224 00:09:36.386534 4824 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-r4c4b" podUID="581e69ae-c21a-4a9e-b1ea-9c38256d7b30" containerName="download-server" probeResult="failure" output="Get 
\"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Feb 24 00:09:36 crc kubenswrapper[4824]: I0224 00:09:36.405359 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ghwq4" podStartSLOduration=166.405330053 podStartE2EDuration="2m46.405330053s" podCreationTimestamp="2026-02-24 00:06:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:09:35.788802431 +0000 UTC m=+239.778426900" watchObservedRunningTime="2026-02-24 00:09:36.405330053 +0000 UTC m=+240.394954522" Feb 24 00:09:36 crc kubenswrapper[4824]: I0224 00:09:36.459656 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:36 crc kubenswrapper[4824]: E0224 00:09:36.460063 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:36.960046766 +0000 UTC m=+240.949671235 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:36 crc kubenswrapper[4824]: I0224 00:09:36.509343 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 24 00:09:36 crc kubenswrapper[4824]: I0224 00:09:36.516407 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-jm7qk" Feb 24 00:09:36 crc kubenswrapper[4824]: I0224 00:09:36.533615 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-jf5jw" Feb 24 00:09:36 crc kubenswrapper[4824]: I0224 00:09:36.565754 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:36 crc kubenswrapper[4824]: E0224 00:09:36.567548 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:37.067509079 +0000 UTC m=+241.057133548 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:36 crc kubenswrapper[4824]: I0224 00:09:36.582921 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jkghx" Feb 24 00:09:36 crc kubenswrapper[4824]: I0224 00:09:36.582954 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jkghx" Feb 24 00:09:36 crc kubenswrapper[4824]: I0224 00:09:36.670597 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:36 crc kubenswrapper[4824]: E0224 00:09:36.672565 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:37.17254661 +0000 UTC m=+241.162171079 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:36 crc kubenswrapper[4824]: I0224 00:09:36.734424 4824 csr.go:261] certificate signing request csr-bjltj is approved, waiting to be issued Feb 24 00:09:36 crc kubenswrapper[4824]: I0224 00:09:36.734754 4824 csr.go:257] certificate signing request csr-bjltj is issued Feb 24 00:09:36 crc kubenswrapper[4824]: I0224 00:09:36.773777 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:36 crc kubenswrapper[4824]: E0224 00:09:36.774773 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:37.274690624 +0000 UTC m=+241.264315153 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:36 crc kubenswrapper[4824]: I0224 00:09:36.795923 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"671ced26-8fac-4a17-a516-ab23ebcd6945","Type":"ContainerStarted","Data":"dc2fc5cc3054549841a2254f5e01ca95caee472926cb159e358b999488865ad7"} Feb 24 00:09:36 crc kubenswrapper[4824]: I0224 00:09:36.802194 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-dq9gz" event={"ID":"42d75b69-be96-43de-8687-444a81d8ebd5","Type":"ContainerStarted","Data":"338a12386f0317dce096ca7a1165344a983be9677fb91c8d31897c5133dbe942"} Feb 24 00:09:36 crc kubenswrapper[4824]: I0224 00:09:36.877301 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:36 crc kubenswrapper[4824]: E0224 00:09:36.877676 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:37.37766363 +0000 UTC m=+241.367288099 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:36 crc kubenswrapper[4824]: I0224 00:09:36.897869 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-fp4wq" Feb 24 00:09:36 crc kubenswrapper[4824]: I0224 00:09:36.907117 4824 patch_prober.go:28] interesting pod/router-default-5444994796-fp4wq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 24 00:09:36 crc kubenswrapper[4824]: [-]has-synced failed: reason withheld Feb 24 00:09:36 crc kubenswrapper[4824]: [+]process-running ok Feb 24 00:09:36 crc kubenswrapper[4824]: healthz check failed Feb 24 00:09:36 crc kubenswrapper[4824]: I0224 00:09:36.907591 4824 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fp4wq" podUID="5b0ff99f-1e04-4e23-895a-a02a303c8daa" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 24 00:09:36 crc kubenswrapper[4824]: I0224 00:09:36.979125 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:36 crc kubenswrapper[4824]: E0224 00:09:36.979251 4824 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:37.47923385 +0000 UTC m=+241.468858309 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:36 crc kubenswrapper[4824]: I0224 00:09:36.981955 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:36 crc kubenswrapper[4824]: E0224 00:09:36.982470 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:37.482461484 +0000 UTC m=+241.472085953 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.035457 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-xfl22" Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.036067 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-xfl22" Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.038329 4824 patch_prober.go:28] interesting pod/apiserver-76f77b778f-xfl22 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.34:8443/livez\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.038381 4824 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-xfl22" podUID="f66ddecd-538b-48bd-a335-e7f99181daa0" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.34:8443/livez\": dial tcp 10.217.0.34:8443: connect: connection refused" Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.082567 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:37 crc kubenswrapper[4824]: E0224 
00:09:37.082813 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:37.582775541 +0000 UTC m=+241.572400010 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.083093 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:37 crc kubenswrapper[4824]: E0224 00:09:37.083559 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:37.583543831 +0000 UTC m=+241.573168300 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.107664 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-p5tqf" Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.184645 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:37 crc kubenswrapper[4824]: E0224 00:09:37.184874 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:37.684840273 +0000 UTC m=+241.674464742 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.185062 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:37 crc kubenswrapper[4824]: E0224 00:09:37.185493 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:37.68547827 +0000 UTC m=+241.675102739 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.186453 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.187822 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.195112 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.195865 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.197265 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rvv9j" Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.208546 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.274996 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dmjz7"] Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.282570 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dmjz7" Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.285922 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.286142 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/393dd5ac-a813-412e-ac2d-1d654d3e5c64-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"393dd5ac-a813-412e-ac2d-1d654d3e5c64\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.286219 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/393dd5ac-a813-412e-ac2d-1d654d3e5c64-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"393dd5ac-a813-412e-ac2d-1d654d3e5c64\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 24 00:09:37 crc kubenswrapper[4824]: E0224 00:09:37.286493 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:37.786456404 +0000 UTC m=+241.776080873 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.292545 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-99tkw" Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.299195 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.316562 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d95c9" Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.320754 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dmjz7"] Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.334070 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jkghx" Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.349628 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2jqvq" Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.350762 4824 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-jkghx container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Feb 24 00:09:37 crc kubenswrapper[4824]: [+]log ok Feb 24 00:09:37 
crc kubenswrapper[4824]: [+]etcd ok Feb 24 00:09:37 crc kubenswrapper[4824]: [+]etcd-readiness ok Feb 24 00:09:37 crc kubenswrapper[4824]: [+]poststarthook/start-apiserver-admission-initializer ok Feb 24 00:09:37 crc kubenswrapper[4824]: [-]informer-sync failed: reason withheld Feb 24 00:09:37 crc kubenswrapper[4824]: [+]poststarthook/generic-apiserver-start-informers ok Feb 24 00:09:37 crc kubenswrapper[4824]: [+]poststarthook/max-in-flight-filter ok Feb 24 00:09:37 crc kubenswrapper[4824]: [+]poststarthook/storage-object-count-tracker-hook ok Feb 24 00:09:37 crc kubenswrapper[4824]: [+]poststarthook/openshift.io-StartUserInformer ok Feb 24 00:09:37 crc kubenswrapper[4824]: [+]poststarthook/openshift.io-StartOAuthInformer ok Feb 24 00:09:37 crc kubenswrapper[4824]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Feb 24 00:09:37 crc kubenswrapper[4824]: [+]shutdown ok Feb 24 00:09:37 crc kubenswrapper[4824]: readyz check failed Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.350858 4824 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jkghx" podUID="836fad19-b7d1-434c-9fd8-faf3eb1d80d1" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.364712 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-zlnwh" Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.364752 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-zlnwh" Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.373934 4824 patch_prober.go:28] interesting pod/console-f9d7485db-zlnwh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.35:8443/health\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body= Feb 24 00:09:37 crc 
kubenswrapper[4824]: I0224 00:09:37.374016 4824 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-zlnwh" podUID="5f4f79cd-ada9-4ec7-b779-94d97bdadc97" containerName="console" probeResult="failure" output="Get \"https://10.217.0.35:8443/health\": dial tcp 10.217.0.35:8443: connect: connection refused" Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.388867 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxnlm\" (UniqueName: \"kubernetes.io/projected/cc119514-5c95-4925-8a1a-3e6844a34e1e-kube-api-access-bxnlm\") pod \"certified-operators-dmjz7\" (UID: \"cc119514-5c95-4925-8a1a-3e6844a34e1e\") " pod="openshift-marketplace/certified-operators-dmjz7" Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.388919 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc119514-5c95-4925-8a1a-3e6844a34e1e-catalog-content\") pod \"certified-operators-dmjz7\" (UID: \"cc119514-5c95-4925-8a1a-3e6844a34e1e\") " pod="openshift-marketplace/certified-operators-dmjz7" Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.388943 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/393dd5ac-a813-412e-ac2d-1d654d3e5c64-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"393dd5ac-a813-412e-ac2d-1d654d3e5c64\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.389010 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/393dd5ac-a813-412e-ac2d-1d654d3e5c64-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"393dd5ac-a813-412e-ac2d-1d654d3e5c64\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 24 00:09:37 crc 
kubenswrapper[4824]: I0224 00:09:37.389071 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc119514-5c95-4925-8a1a-3e6844a34e1e-utilities\") pod \"certified-operators-dmjz7\" (UID: \"cc119514-5c95-4925-8a1a-3e6844a34e1e\") " pod="openshift-marketplace/certified-operators-dmjz7" Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.389143 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:37 crc kubenswrapper[4824]: E0224 00:09:37.389538 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:37.889504522 +0000 UTC m=+241.879128991 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.389669 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/393dd5ac-a813-412e-ac2d-1d654d3e5c64-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"393dd5ac-a813-412e-ac2d-1d654d3e5c64\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.470250 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hhftg"] Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.480930 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hhftg" Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.490673 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.491030 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxnlm\" (UniqueName: \"kubernetes.io/projected/cc119514-5c95-4925-8a1a-3e6844a34e1e-kube-api-access-bxnlm\") pod \"certified-operators-dmjz7\" (UID: \"cc119514-5c95-4925-8a1a-3e6844a34e1e\") " pod="openshift-marketplace/certified-operators-dmjz7" Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.491053 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc119514-5c95-4925-8a1a-3e6844a34e1e-catalog-content\") pod \"certified-operators-dmjz7\" (UID: \"cc119514-5c95-4925-8a1a-3e6844a34e1e\") " pod="openshift-marketplace/certified-operators-dmjz7" Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.491203 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc119514-5c95-4925-8a1a-3e6844a34e1e-utilities\") pod \"certified-operators-dmjz7\" (UID: \"cc119514-5c95-4925-8a1a-3e6844a34e1e\") " pod="openshift-marketplace/certified-operators-dmjz7" Feb 24 00:09:37 crc kubenswrapper[4824]: E0224 00:09:37.492678 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-24 00:09:37.992659633 +0000 UTC m=+241.982284102 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.494213 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc119514-5c95-4925-8a1a-3e6844a34e1e-catalog-content\") pod \"certified-operators-dmjz7\" (UID: \"cc119514-5c95-4925-8a1a-3e6844a34e1e\") " pod="openshift-marketplace/certified-operators-dmjz7" Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.500029 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/393dd5ac-a813-412e-ac2d-1d654d3e5c64-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"393dd5ac-a813-412e-ac2d-1d654d3e5c64\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.504032 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc119514-5c95-4925-8a1a-3e6844a34e1e-utilities\") pod \"certified-operators-dmjz7\" (UID: \"cc119514-5c95-4925-8a1a-3e6844a34e1e\") " pod="openshift-marketplace/certified-operators-dmjz7" Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.508233 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.541797 4824 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.567142 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xs7nb" Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.593580 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e306ddf-071d-47f2-b9b1-bf772963438e-utilities\") pod \"community-operators-hhftg\" (UID: \"3e306ddf-071d-47f2-b9b1-bf772963438e\") " pod="openshift-marketplace/community-operators-hhftg" Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.593692 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.593750 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e306ddf-071d-47f2-b9b1-bf772963438e-catalog-content\") pod \"community-operators-hhftg\" (UID: \"3e306ddf-071d-47f2-b9b1-bf772963438e\") " pod="openshift-marketplace/community-operators-hhftg" Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.593780 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndbcf\" (UniqueName: \"kubernetes.io/projected/3e306ddf-071d-47f2-b9b1-bf772963438e-kube-api-access-ndbcf\") pod \"community-operators-hhftg\" (UID: \"3e306ddf-071d-47f2-b9b1-bf772963438e\") " 
pod="openshift-marketplace/community-operators-hhftg" Feb 24 00:09:37 crc kubenswrapper[4824]: E0224 00:09:37.594114 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:38.094098839 +0000 UTC m=+242.083723308 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.651385 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xs7nb" Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.696446 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.697107 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e306ddf-071d-47f2-b9b1-bf772963438e-catalog-content\") pod \"community-operators-hhftg\" (UID: \"3e306ddf-071d-47f2-b9b1-bf772963438e\") " pod="openshift-marketplace/community-operators-hhftg" Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.697139 4824 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-ndbcf\" (UniqueName: \"kubernetes.io/projected/3e306ddf-071d-47f2-b9b1-bf772963438e-kube-api-access-ndbcf\") pod \"community-operators-hhftg\" (UID: \"3e306ddf-071d-47f2-b9b1-bf772963438e\") " pod="openshift-marketplace/community-operators-hhftg" Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.697195 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e306ddf-071d-47f2-b9b1-bf772963438e-utilities\") pod \"community-operators-hhftg\" (UID: \"3e306ddf-071d-47f2-b9b1-bf772963438e\") " pod="openshift-marketplace/community-operators-hhftg" Feb 24 00:09:37 crc kubenswrapper[4824]: E0224 00:09:37.698123 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:38.198095132 +0000 UTC m=+242.187719601 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.701635 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e306ddf-071d-47f2-b9b1-bf772963438e-catalog-content\") pod \"community-operators-hhftg\" (UID: \"3e306ddf-071d-47f2-b9b1-bf772963438e\") " pod="openshift-marketplace/community-operators-hhftg" Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.715576 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e306ddf-071d-47f2-b9b1-bf772963438e-utilities\") pod \"community-operators-hhftg\" (UID: \"3e306ddf-071d-47f2-b9b1-bf772963438e\") " pod="openshift-marketplace/community-operators-hhftg" Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.736354 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 00:04:36 +0000 UTC, rotation deadline is 2027-01-12 08:09:59.953671046 +0000 UTC Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.736402 4824 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7736h0m22.217271971s for next certificate rotation Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.801544 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: 
\"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:37 crc kubenswrapper[4824]: E0224 00:09:37.802215 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:38.302194527 +0000 UTC m=+242.291818996 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.849833 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"671ced26-8fac-4a17-a516-ab23ebcd6945","Type":"ContainerStarted","Data":"c1a2c2c662a6d26367b757b336ffe84f28d97409d74d7d15771e02012f07ca5b"} Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.861607 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hhftg"] Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.872574 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6kgrd"] Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.874635 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6kgrd" Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.879348 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndbcf\" (UniqueName: \"kubernetes.io/projected/3e306ddf-071d-47f2-b9b1-bf772963438e-kube-api-access-ndbcf\") pod \"community-operators-hhftg\" (UID: \"3e306ddf-071d-47f2-b9b1-bf772963438e\") " pod="openshift-marketplace/community-operators-hhftg" Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.883505 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxnlm\" (UniqueName: \"kubernetes.io/projected/cc119514-5c95-4925-8a1a-3e6844a34e1e-kube-api-access-bxnlm\") pod \"certified-operators-dmjz7\" (UID: \"cc119514-5c95-4925-8a1a-3e6844a34e1e\") " pod="openshift-marketplace/certified-operators-dmjz7" Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.895608 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6kgrd"] Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.899383 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bfhcg"] Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.905590 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:37 crc kubenswrapper[4824]: E0224 00:09:37.906161 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-24 00:09:38.406135509 +0000 UTC m=+242.395759978 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.917501 4824 patch_prober.go:28] interesting pod/router-default-5444994796-fp4wq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 24 00:09:37 crc kubenswrapper[4824]: [-]has-synced failed: reason withheld Feb 24 00:09:37 crc kubenswrapper[4824]: [+]process-running ok Feb 24 00:09:37 crc kubenswrapper[4824]: healthz check failed Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.917579 4824 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fp4wq" podUID="5b0ff99f-1e04-4e23-895a-a02a303c8daa" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.919308 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dmjz7" Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.921777 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-dq9gz" event={"ID":"42d75b69-be96-43de-8687-444a81d8ebd5","Type":"ContainerStarted","Data":"a741aa07fb76f7ffe35f231ee36df68a7e9c2e0f768faf293f22c13513da4d44"} Feb 24 00:09:37 crc kubenswrapper[4824]: I0224 00:09:37.921930 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bfhcg" Feb 24 00:09:38 crc kubenswrapper[4824]: I0224 00:09:38.005225 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bfhcg"] Feb 24 00:09:38 crc kubenswrapper[4824]: I0224 00:09:38.024216 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4-utilities\") pod \"certified-operators-6kgrd\" (UID: \"b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4\") " pod="openshift-marketplace/certified-operators-6kgrd" Feb 24 00:09:38 crc kubenswrapper[4824]: I0224 00:09:38.024271 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b00860ed-9085-40bb-9041-16eac6d88fb1-utilities\") pod \"community-operators-bfhcg\" (UID: \"b00860ed-9085-40bb-9041-16eac6d88fb1\") " pod="openshift-marketplace/community-operators-bfhcg" Feb 24 00:09:38 crc kubenswrapper[4824]: I0224 00:09:38.024310 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c94kl\" (UniqueName: \"kubernetes.io/projected/b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4-kube-api-access-c94kl\") pod \"certified-operators-6kgrd\" (UID: \"b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4\") " 
pod="openshift-marketplace/certified-operators-6kgrd" Feb 24 00:09:38 crc kubenswrapper[4824]: I0224 00:09:38.024369 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4-catalog-content\") pod \"certified-operators-6kgrd\" (UID: \"b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4\") " pod="openshift-marketplace/certified-operators-6kgrd" Feb 24 00:09:38 crc kubenswrapper[4824]: I0224 00:09:38.024405 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b00860ed-9085-40bb-9041-16eac6d88fb1-catalog-content\") pod \"community-operators-bfhcg\" (UID: \"b00860ed-9085-40bb-9041-16eac6d88fb1\") " pod="openshift-marketplace/community-operators-bfhcg" Feb 24 00:09:38 crc kubenswrapper[4824]: I0224 00:09:38.024442 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:38 crc kubenswrapper[4824]: I0224 00:09:38.024499 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t895b\" (UniqueName: \"kubernetes.io/projected/b00860ed-9085-40bb-9041-16eac6d88fb1-kube-api-access-t895b\") pod \"community-operators-bfhcg\" (UID: \"b00860ed-9085-40bb-9041-16eac6d88fb1\") " pod="openshift-marketplace/community-operators-bfhcg" Feb 24 00:09:38 crc kubenswrapper[4824]: E0224 00:09:38.028006 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2026-02-24 00:09:38.527970329 +0000 UTC m=+242.517594798 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:38 crc kubenswrapper[4824]: I0224 00:09:38.125796 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:38 crc kubenswrapper[4824]: I0224 00:09:38.126700 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b00860ed-9085-40bb-9041-16eac6d88fb1-catalog-content\") pod \"community-operators-bfhcg\" (UID: \"b00860ed-9085-40bb-9041-16eac6d88fb1\") " pod="openshift-marketplace/community-operators-bfhcg" Feb 24 00:09:38 crc kubenswrapper[4824]: I0224 00:09:38.126789 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t895b\" (UniqueName: \"kubernetes.io/projected/b00860ed-9085-40bb-9041-16eac6d88fb1-kube-api-access-t895b\") pod \"community-operators-bfhcg\" (UID: \"b00860ed-9085-40bb-9041-16eac6d88fb1\") " pod="openshift-marketplace/community-operators-bfhcg" Feb 24 00:09:38 crc kubenswrapper[4824]: I0224 00:09:38.126888 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4-utilities\") pod \"certified-operators-6kgrd\" (UID: \"b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4\") " pod="openshift-marketplace/certified-operators-6kgrd" Feb 24 00:09:38 crc kubenswrapper[4824]: I0224 00:09:38.126931 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b00860ed-9085-40bb-9041-16eac6d88fb1-utilities\") pod \"community-operators-bfhcg\" (UID: \"b00860ed-9085-40bb-9041-16eac6d88fb1\") " pod="openshift-marketplace/community-operators-bfhcg" Feb 24 00:09:38 crc kubenswrapper[4824]: I0224 00:09:38.126957 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c94kl\" (UniqueName: \"kubernetes.io/projected/b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4-kube-api-access-c94kl\") pod \"certified-operators-6kgrd\" (UID: \"b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4\") " pod="openshift-marketplace/certified-operators-6kgrd" Feb 24 00:09:38 crc kubenswrapper[4824]: I0224 00:09:38.126978 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4-catalog-content\") pod \"certified-operators-6kgrd\" (UID: \"b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4\") " pod="openshift-marketplace/certified-operators-6kgrd" Feb 24 00:09:38 crc kubenswrapper[4824]: I0224 00:09:38.131085 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4-catalog-content\") pod \"certified-operators-6kgrd\" (UID: \"b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4\") " pod="openshift-marketplace/certified-operators-6kgrd" Feb 24 00:09:38 crc kubenswrapper[4824]: E0224 00:09:38.131997 4824 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:38.631973592 +0000 UTC m=+242.621598051 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:38 crc kubenswrapper[4824]: I0224 00:09:38.136097 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b00860ed-9085-40bb-9041-16eac6d88fb1-utilities\") pod \"community-operators-bfhcg\" (UID: \"b00860ed-9085-40bb-9041-16eac6d88fb1\") " pod="openshift-marketplace/community-operators-bfhcg" Feb 24 00:09:38 crc kubenswrapper[4824]: I0224 00:09:38.136391 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4-utilities\") pod \"certified-operators-6kgrd\" (UID: \"b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4\") " pod="openshift-marketplace/certified-operators-6kgrd" Feb 24 00:09:38 crc kubenswrapper[4824]: I0224 00:09:38.148274 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b00860ed-9085-40bb-9041-16eac6d88fb1-catalog-content\") pod \"community-operators-bfhcg\" (UID: \"b00860ed-9085-40bb-9041-16eac6d88fb1\") " pod="openshift-marketplace/community-operators-bfhcg" Feb 24 00:09:38 crc kubenswrapper[4824]: I0224 00:09:38.148900 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hhftg" Feb 24 00:09:38 crc kubenswrapper[4824]: I0224 00:09:38.202593 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c94kl\" (UniqueName: \"kubernetes.io/projected/b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4-kube-api-access-c94kl\") pod \"certified-operators-6kgrd\" (UID: \"b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4\") " pod="openshift-marketplace/certified-operators-6kgrd" Feb 24 00:09:38 crc kubenswrapper[4824]: I0224 00:09:38.215710 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t895b\" (UniqueName: \"kubernetes.io/projected/b00860ed-9085-40bb-9041-16eac6d88fb1-kube-api-access-t895b\") pod \"community-operators-bfhcg\" (UID: \"b00860ed-9085-40bb-9041-16eac6d88fb1\") " pod="openshift-marketplace/community-operators-bfhcg" Feb 24 00:09:38 crc kubenswrapper[4824]: I0224 00:09:38.234883 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:38 crc kubenswrapper[4824]: E0224 00:09:38.235272 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:38.735259916 +0000 UTC m=+242.724884375 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:38 crc kubenswrapper[4824]: I0224 00:09:38.256233 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-d95c9" Feb 24 00:09:38 crc kubenswrapper[4824]: I0224 00:09:38.271411 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bfhcg" Feb 24 00:09:38 crc kubenswrapper[4824]: I0224 00:09:38.310629 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=3.310592179 podStartE2EDuration="3.310592179s" podCreationTimestamp="2026-02-24 00:09:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:09:38.271224258 +0000 UTC m=+242.260848727" watchObservedRunningTime="2026-02-24 00:09:38.310592179 +0000 UTC m=+242.300216658" Feb 24 00:09:38 crc kubenswrapper[4824]: I0224 00:09:38.336080 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:38 crc kubenswrapper[4824]: E0224 00:09:38.343995 4824 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:38.843944142 +0000 UTC m=+242.833568611 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:38 crc kubenswrapper[4824]: I0224 00:09:38.358162 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:38 crc kubenswrapper[4824]: E0224 00:09:38.358727 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:38.858711719 +0000 UTC m=+242.848336188 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:38 crc kubenswrapper[4824]: I0224 00:09:38.369125 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6kgrd" Feb 24 00:09:38 crc kubenswrapper[4824]: I0224 00:09:38.464075 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-9ml5g" Feb 24 00:09:38 crc kubenswrapper[4824]: I0224 00:09:38.466510 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:38 crc kubenswrapper[4824]: E0224 00:09:38.467369 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:38.967347163 +0000 UTC m=+242.956971642 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:38 crc kubenswrapper[4824]: I0224 00:09:38.559959 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-jm7qk"] Feb 24 00:09:38 crc kubenswrapper[4824]: I0224 00:09:38.560215 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-jm7qk" podUID="49f90ffa-baa8-4d9b-bd17-3a12f9c6f115" containerName="controller-manager" containerID="cri-o://5b26b0b907a714e0eb8fa1b65c28de4396d3dfbda124fc0f95f4d779730bf39c" gracePeriod=30 Feb 24 00:09:38 crc kubenswrapper[4824]: I0224 00:09:38.569660 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:38 crc kubenswrapper[4824]: E0224 00:09:38.570278 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:39.070258428 +0000 UTC m=+243.059882887 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:38 crc kubenswrapper[4824]: I0224 00:09:38.650737 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9k27r"] Feb 24 00:09:38 crc kubenswrapper[4824]: I0224 00:09:38.651074 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9k27r" podUID="1c4e1d48-7f8d-44b6-97b3-3ceccb35385b" containerName="route-controller-manager" containerID="cri-o://b81e27c4f382da70441b9aeabc639347240cb4f01fe8b81ba75c60688c944968" gracePeriod=30 Feb 24 00:09:38 crc kubenswrapper[4824]: I0224 00:09:38.670755 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:38 crc kubenswrapper[4824]: E0224 00:09:38.671393 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:39.171377625 +0000 UTC m=+243.161002094 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:38 crc kubenswrapper[4824]: I0224 00:09:38.687210 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 24 00:09:38 crc kubenswrapper[4824]: I0224 00:09:38.748716 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dmjz7"] Feb 24 00:09:38 crc kubenswrapper[4824]: I0224 00:09:38.773886 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:38 crc kubenswrapper[4824]: E0224 00:09:38.774358 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:39.274342671 +0000 UTC m=+243.263967130 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:38 crc kubenswrapper[4824]: I0224 00:09:38.877364 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:38 crc kubenswrapper[4824]: E0224 00:09:38.877794 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 00:09:39.377770199 +0000 UTC m=+243.367394668 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:38 crc kubenswrapper[4824]: I0224 00:09:38.889581 4824 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Feb 24 00:09:38 crc kubenswrapper[4824]: I0224 00:09:38.900234 4824 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-24T00:09:38.889611139Z","Handler":null,"Name":""} Feb 24 00:09:38 crc kubenswrapper[4824]: I0224 00:09:38.912079 4824 patch_prober.go:28] interesting pod/router-default-5444994796-fp4wq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 24 00:09:38 crc kubenswrapper[4824]: [-]has-synced failed: reason withheld Feb 24 00:09:38 crc kubenswrapper[4824]: [+]process-running ok Feb 24 00:09:38 crc kubenswrapper[4824]: healthz check failed Feb 24 00:09:38 crc kubenswrapper[4824]: I0224 00:09:38.912183 4824 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fp4wq" podUID="5b0ff99f-1e04-4e23-895a-a02a303c8daa" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 24 00:09:38 crc kubenswrapper[4824]: I0224 00:09:38.969142 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
event={"ID":"393dd5ac-a813-412e-ac2d-1d654d3e5c64","Type":"ContainerStarted","Data":"7c1c92983cde9f50ed89292c9ba489e6609f2698fffaab5523fc26a4c6ca4f45"} Feb 24 00:09:38 crc kubenswrapper[4824]: I0224 00:09:38.981288 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:38 crc kubenswrapper[4824]: E0224 00:09:38.981707 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 00:09:39.48169048 +0000 UTC m=+243.471314949 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ccm27" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 00:09:38 crc kubenswrapper[4824]: I0224 00:09:38.994266 4824 generic.go:334] "Generic (PLEG): container finished" podID="1c4e1d48-7f8d-44b6-97b3-3ceccb35385b" containerID="b81e27c4f382da70441b9aeabc639347240cb4f01fe8b81ba75c60688c944968" exitCode=0 Feb 24 00:09:38 crc kubenswrapper[4824]: I0224 00:09:38.994407 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9k27r" 
event={"ID":"1c4e1d48-7f8d-44b6-97b3-3ceccb35385b","Type":"ContainerDied","Data":"b81e27c4f382da70441b9aeabc639347240cb4f01fe8b81ba75c60688c944968"} Feb 24 00:09:38 crc kubenswrapper[4824]: I0224 00:09:38.994684 4824 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 24 00:09:38 crc kubenswrapper[4824]: I0224 00:09:38.994735 4824 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.011659 4824 generic.go:334] "Generic (PLEG): container finished" podID="49f90ffa-baa8-4d9b-bd17-3a12f9c6f115" containerID="5b26b0b907a714e0eb8fa1b65c28de4396d3dfbda124fc0f95f4d779730bf39c" exitCode=0 Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.011804 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-jm7qk" event={"ID":"49f90ffa-baa8-4d9b-bd17-3a12f9c6f115","Type":"ContainerDied","Data":"5b26b0b907a714e0eb8fa1b65c28de4396d3dfbda124fc0f95f4d779730bf39c"} Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.016627 4824 generic.go:334] "Generic (PLEG): container finished" podID="671ced26-8fac-4a17-a516-ab23ebcd6945" containerID="c1a2c2c662a6d26367b757b336ffe84f28d97409d74d7d15771e02012f07ca5b" exitCode=0 Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.017060 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"671ced26-8fac-4a17-a516-ab23ebcd6945","Type":"ContainerDied","Data":"c1a2c2c662a6d26367b757b336ffe84f28d97409d74d7d15771e02012f07ca5b"} Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.049062 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-dq9gz" 
event={"ID":"42d75b69-be96-43de-8687-444a81d8ebd5","Type":"ContainerStarted","Data":"5cc948d744be03102a61342e2f49ec08158b5ed47625f2fd4288c19c07ae5798"} Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.050794 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dmjz7" event={"ID":"cc119514-5c95-4925-8a1a-3e6844a34e1e","Type":"ContainerStarted","Data":"42b124ba705dc951f666837537c3a14e76c91f608879722e252c98578703a4ac"} Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.089945 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.172260 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.198370 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.199013 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nzqwf"] Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.207290 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nzqwf"] Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.207436 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nzqwf" Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.211552 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.211755 4824 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.211810 4824 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.313611 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ccm27\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") " pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.362669 4824 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-jm7qk" Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.417722 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hggqz\" (UniqueName: \"kubernetes.io/projected/b142d96b-87c3-444b-b135-fdddaa658234-kube-api-access-hggqz\") pod \"redhat-marketplace-nzqwf\" (UID: \"b142d96b-87c3-444b-b135-fdddaa658234\") " pod="openshift-marketplace/redhat-marketplace-nzqwf" Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.418009 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b142d96b-87c3-444b-b135-fdddaa658234-catalog-content\") pod \"redhat-marketplace-nzqwf\" (UID: \"b142d96b-87c3-444b-b135-fdddaa658234\") " pod="openshift-marketplace/redhat-marketplace-nzqwf" Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.418057 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b142d96b-87c3-444b-b135-fdddaa658234-utilities\") pod \"redhat-marketplace-nzqwf\" (UID: \"b142d96b-87c3-444b-b135-fdddaa658234\") " pod="openshift-marketplace/redhat-marketplace-nzqwf" Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.429657 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hhftg"] Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.445904 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6kgrd"] Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.467372 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bfhcg"] Feb 24 00:09:39 crc kubenswrapper[4824]: W0224 00:09:39.474227 4824 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb00860ed_9085_40bb_9041_16eac6d88fb1.slice/crio-9c018af5a403cd93073513dceb16d5cd69816c1726faff3f2cab188f2753d464 WatchSource:0}: Error finding container 9c018af5a403cd93073513dceb16d5cd69816c1726faff3f2cab188f2753d464: Status 404 returned error can't find the container with id 9c018af5a403cd93073513dceb16d5cd69816c1726faff3f2cab188f2753d464 Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.490670 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9k27r" Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.518904 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49f90ffa-baa8-4d9b-bd17-3a12f9c6f115-config\") pod \"49f90ffa-baa8-4d9b-bd17-3a12f9c6f115\" (UID: \"49f90ffa-baa8-4d9b-bd17-3a12f9c6f115\") " Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.518969 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49f90ffa-baa8-4d9b-bd17-3a12f9c6f115-serving-cert\") pod \"49f90ffa-baa8-4d9b-bd17-3a12f9c6f115\" (UID: \"49f90ffa-baa8-4d9b-bd17-3a12f9c6f115\") " Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.519000 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c4e1d48-7f8d-44b6-97b3-3ceccb35385b-config\") pod \"1c4e1d48-7f8d-44b6-97b3-3ceccb35385b\" (UID: \"1c4e1d48-7f8d-44b6-97b3-3ceccb35385b\") " Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.519048 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1c4e1d48-7f8d-44b6-97b3-3ceccb35385b-client-ca\") pod \"1c4e1d48-7f8d-44b6-97b3-3ceccb35385b\" (UID: 
\"1c4e1d48-7f8d-44b6-97b3-3ceccb35385b\") " Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.519116 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c4e1d48-7f8d-44b6-97b3-3ceccb35385b-serving-cert\") pod \"1c4e1d48-7f8d-44b6-97b3-3ceccb35385b\" (UID: \"1c4e1d48-7f8d-44b6-97b3-3ceccb35385b\") " Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.519203 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqjj7\" (UniqueName: \"kubernetes.io/projected/49f90ffa-baa8-4d9b-bd17-3a12f9c6f115-kube-api-access-sqjj7\") pod \"49f90ffa-baa8-4d9b-bd17-3a12f9c6f115\" (UID: \"49f90ffa-baa8-4d9b-bd17-3a12f9c6f115\") " Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.519245 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nw5js\" (UniqueName: \"kubernetes.io/projected/1c4e1d48-7f8d-44b6-97b3-3ceccb35385b-kube-api-access-nw5js\") pod \"1c4e1d48-7f8d-44b6-97b3-3ceccb35385b\" (UID: \"1c4e1d48-7f8d-44b6-97b3-3ceccb35385b\") " Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.519271 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/49f90ffa-baa8-4d9b-bd17-3a12f9c6f115-proxy-ca-bundles\") pod \"49f90ffa-baa8-4d9b-bd17-3a12f9c6f115\" (UID: \"49f90ffa-baa8-4d9b-bd17-3a12f9c6f115\") " Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.519314 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/49f90ffa-baa8-4d9b-bd17-3a12f9c6f115-client-ca\") pod \"49f90ffa-baa8-4d9b-bd17-3a12f9c6f115\" (UID: \"49f90ffa-baa8-4d9b-bd17-3a12f9c6f115\") " Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.519564 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/b142d96b-87c3-444b-b135-fdddaa658234-catalog-content\") pod \"redhat-marketplace-nzqwf\" (UID: \"b142d96b-87c3-444b-b135-fdddaa658234\") " pod="openshift-marketplace/redhat-marketplace-nzqwf" Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.519597 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b142d96b-87c3-444b-b135-fdddaa658234-utilities\") pod \"redhat-marketplace-nzqwf\" (UID: \"b142d96b-87c3-444b-b135-fdddaa658234\") " pod="openshift-marketplace/redhat-marketplace-nzqwf" Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.519689 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hggqz\" (UniqueName: \"kubernetes.io/projected/b142d96b-87c3-444b-b135-fdddaa658234-kube-api-access-hggqz\") pod \"redhat-marketplace-nzqwf\" (UID: \"b142d96b-87c3-444b-b135-fdddaa658234\") " pod="openshift-marketplace/redhat-marketplace-nzqwf" Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.520934 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b142d96b-87c3-444b-b135-fdddaa658234-catalog-content\") pod \"redhat-marketplace-nzqwf\" (UID: \"b142d96b-87c3-444b-b135-fdddaa658234\") " pod="openshift-marketplace/redhat-marketplace-nzqwf" Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.521349 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c4e1d48-7f8d-44b6-97b3-3ceccb35385b-config" (OuterVolumeSpecName: "config") pod "1c4e1d48-7f8d-44b6-97b3-3ceccb35385b" (UID: "1c4e1d48-7f8d-44b6-97b3-3ceccb35385b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.521561 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b142d96b-87c3-444b-b135-fdddaa658234-utilities\") pod \"redhat-marketplace-nzqwf\" (UID: \"b142d96b-87c3-444b-b135-fdddaa658234\") " pod="openshift-marketplace/redhat-marketplace-nzqwf" Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.522546 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49f90ffa-baa8-4d9b-bd17-3a12f9c6f115-config" (OuterVolumeSpecName: "config") pod "49f90ffa-baa8-4d9b-bd17-3a12f9c6f115" (UID: "49f90ffa-baa8-4d9b-bd17-3a12f9c6f115"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.523074 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49f90ffa-baa8-4d9b-bd17-3a12f9c6f115-client-ca" (OuterVolumeSpecName: "client-ca") pod "49f90ffa-baa8-4d9b-bd17-3a12f9c6f115" (UID: "49f90ffa-baa8-4d9b-bd17-3a12f9c6f115"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.523421 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49f90ffa-baa8-4d9b-bd17-3a12f9c6f115-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "49f90ffa-baa8-4d9b-bd17-3a12f9c6f115" (UID: "49f90ffa-baa8-4d9b-bd17-3a12f9c6f115"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.528012 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c4e1d48-7f8d-44b6-97b3-3ceccb35385b-client-ca" (OuterVolumeSpecName: "client-ca") pod "1c4e1d48-7f8d-44b6-97b3-3ceccb35385b" (UID: "1c4e1d48-7f8d-44b6-97b3-3ceccb35385b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.533901 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49f90ffa-baa8-4d9b-bd17-3a12f9c6f115-kube-api-access-sqjj7" (OuterVolumeSpecName: "kube-api-access-sqjj7") pod "49f90ffa-baa8-4d9b-bd17-3a12f9c6f115" (UID: "49f90ffa-baa8-4d9b-bd17-3a12f9c6f115"). InnerVolumeSpecName "kube-api-access-sqjj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.534074 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c4e1d48-7f8d-44b6-97b3-3ceccb35385b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1c4e1d48-7f8d-44b6-97b3-3ceccb35385b" (UID: "1c4e1d48-7f8d-44b6-97b3-3ceccb35385b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.538781 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.538977 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49f90ffa-baa8-4d9b-bd17-3a12f9c6f115-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "49f90ffa-baa8-4d9b-bd17-3a12f9c6f115" (UID: "49f90ffa-baa8-4d9b-bd17-3a12f9c6f115"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.542075 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hggqz\" (UniqueName: \"kubernetes.io/projected/b142d96b-87c3-444b-b135-fdddaa658234-kube-api-access-hggqz\") pod \"redhat-marketplace-nzqwf\" (UID: \"b142d96b-87c3-444b-b135-fdddaa658234\") " pod="openshift-marketplace/redhat-marketplace-nzqwf" Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.544934 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.546776 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c4e1d48-7f8d-44b6-97b3-3ceccb35385b-kube-api-access-nw5js" (OuterVolumeSpecName: "kube-api-access-nw5js") pod "1c4e1d48-7f8d-44b6-97b3-3ceccb35385b" (UID: "1c4e1d48-7f8d-44b6-97b3-3ceccb35385b"). InnerVolumeSpecName "kube-api-access-nw5js". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.590709 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mfzkw"] Feb 24 00:09:39 crc kubenswrapper[4824]: E0224 00:09:39.591314 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49f90ffa-baa8-4d9b-bd17-3a12f9c6f115" containerName="controller-manager" Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.591343 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="49f90ffa-baa8-4d9b-bd17-3a12f9c6f115" containerName="controller-manager" Feb 24 00:09:39 crc kubenswrapper[4824]: E0224 00:09:39.591362 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c4e1d48-7f8d-44b6-97b3-3ceccb35385b" containerName="route-controller-manager" Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.591370 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c4e1d48-7f8d-44b6-97b3-3ceccb35385b" containerName="route-controller-manager" Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.591493 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c4e1d48-7f8d-44b6-97b3-3ceccb35385b" containerName="route-controller-manager" Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.591505 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="49f90ffa-baa8-4d9b-bd17-3a12f9c6f115" containerName="controller-manager" Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.596736 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mfzkw" Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.607683 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mfzkw"] Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.630307 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08de7fe0-2d54-408b-8e09-3e1b9bcf931a-catalog-content\") pod \"redhat-marketplace-mfzkw\" (UID: \"08de7fe0-2d54-408b-8e09-3e1b9bcf931a\") " pod="openshift-marketplace/redhat-marketplace-mfzkw" Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.630373 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08de7fe0-2d54-408b-8e09-3e1b9bcf931a-utilities\") pod \"redhat-marketplace-mfzkw\" (UID: \"08de7fe0-2d54-408b-8e09-3e1b9bcf931a\") " pod="openshift-marketplace/redhat-marketplace-mfzkw" Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.630479 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8k4qw\" (UniqueName: \"kubernetes.io/projected/08de7fe0-2d54-408b-8e09-3e1b9bcf931a-kube-api-access-8k4qw\") pod \"redhat-marketplace-mfzkw\" (UID: \"08de7fe0-2d54-408b-8e09-3e1b9bcf931a\") " pod="openshift-marketplace/redhat-marketplace-mfzkw" Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.630712 4824 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1c4e1d48-7f8d-44b6-97b3-3ceccb35385b-client-ca\") on node \"crc\" DevicePath \"\"" Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.630725 4824 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c4e1d48-7f8d-44b6-97b3-3ceccb35385b-serving-cert\") on node \"crc\" 
DevicePath \"\"" Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.630737 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqjj7\" (UniqueName: \"kubernetes.io/projected/49f90ffa-baa8-4d9b-bd17-3a12f9c6f115-kube-api-access-sqjj7\") on node \"crc\" DevicePath \"\"" Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.630749 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nw5js\" (UniqueName: \"kubernetes.io/projected/1c4e1d48-7f8d-44b6-97b3-3ceccb35385b-kube-api-access-nw5js\") on node \"crc\" DevicePath \"\"" Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.630759 4824 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/49f90ffa-baa8-4d9b-bd17-3a12f9c6f115-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.630769 4824 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/49f90ffa-baa8-4d9b-bd17-3a12f9c6f115-client-ca\") on node \"crc\" DevicePath \"\"" Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.630778 4824 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49f90ffa-baa8-4d9b-bd17-3a12f9c6f115-config\") on node \"crc\" DevicePath \"\"" Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.630787 4824 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49f90ffa-baa8-4d9b-bd17-3a12f9c6f115-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.630796 4824 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c4e1d48-7f8d-44b6-97b3-3ceccb35385b-config\") on node \"crc\" DevicePath \"\"" Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.663963 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nzqwf" Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.732147 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8k4qw\" (UniqueName: \"kubernetes.io/projected/08de7fe0-2d54-408b-8e09-3e1b9bcf931a-kube-api-access-8k4qw\") pod \"redhat-marketplace-mfzkw\" (UID: \"08de7fe0-2d54-408b-8e09-3e1b9bcf931a\") " pod="openshift-marketplace/redhat-marketplace-mfzkw" Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.732233 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08de7fe0-2d54-408b-8e09-3e1b9bcf931a-catalog-content\") pod \"redhat-marketplace-mfzkw\" (UID: \"08de7fe0-2d54-408b-8e09-3e1b9bcf931a\") " pod="openshift-marketplace/redhat-marketplace-mfzkw" Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.732269 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08de7fe0-2d54-408b-8e09-3e1b9bcf931a-utilities\") pod \"redhat-marketplace-mfzkw\" (UID: \"08de7fe0-2d54-408b-8e09-3e1b9bcf931a\") " pod="openshift-marketplace/redhat-marketplace-mfzkw" Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.741338 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08de7fe0-2d54-408b-8e09-3e1b9bcf931a-utilities\") pod \"redhat-marketplace-mfzkw\" (UID: \"08de7fe0-2d54-408b-8e09-3e1b9bcf931a\") " pod="openshift-marketplace/redhat-marketplace-mfzkw" Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.742806 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08de7fe0-2d54-408b-8e09-3e1b9bcf931a-catalog-content\") pod \"redhat-marketplace-mfzkw\" (UID: \"08de7fe0-2d54-408b-8e09-3e1b9bcf931a\") " 
pod="openshift-marketplace/redhat-marketplace-mfzkw" Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.762051 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8k4qw\" (UniqueName: \"kubernetes.io/projected/08de7fe0-2d54-408b-8e09-3e1b9bcf931a-kube-api-access-8k4qw\") pod \"redhat-marketplace-mfzkw\" (UID: \"08de7fe0-2d54-408b-8e09-3e1b9bcf931a\") " pod="openshift-marketplace/redhat-marketplace-mfzkw" Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.904408 4824 patch_prober.go:28] interesting pod/router-default-5444994796-fp4wq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 24 00:09:39 crc kubenswrapper[4824]: [-]has-synced failed: reason withheld Feb 24 00:09:39 crc kubenswrapper[4824]: [+]process-running ok Feb 24 00:09:39 crc kubenswrapper[4824]: healthz check failed Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.904482 4824 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fp4wq" podUID="5b0ff99f-1e04-4e23-895a-a02a303c8daa" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.940686 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mfzkw" Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.945859 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7b64d957-q2tx6"] Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.947029 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7b64d957-q2tx6" Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.951358 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57cf5fdf8c-kzjzb"] Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.952536 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57cf5fdf8c-kzjzb" Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.961917 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57cf5fdf8c-kzjzb"] Feb 24 00:09:39 crc kubenswrapper[4824]: I0224 00:09:39.966580 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7b64d957-q2tx6"] Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.023882 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ccm27"] Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.048075 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/963d91ec-628d-4269-bfc9-2c6ffb4845b9-client-ca\") pod \"controller-manager-7b64d957-q2tx6\" (UID: \"963d91ec-628d-4269-bfc9-2c6ffb4845b9\") " pod="openshift-controller-manager/controller-manager-7b64d957-q2tx6" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.048119 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/963d91ec-628d-4269-bfc9-2c6ffb4845b9-proxy-ca-bundles\") pod \"controller-manager-7b64d957-q2tx6\" (UID: \"963d91ec-628d-4269-bfc9-2c6ffb4845b9\") " pod="openshift-controller-manager/controller-manager-7b64d957-q2tx6" Feb 24 00:09:40 crc kubenswrapper[4824]: 
I0224 00:09:40.048152 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e5605b6-71ca-4b14-9feb-c2036ed86648-config\") pod \"route-controller-manager-57cf5fdf8c-kzjzb\" (UID: \"2e5605b6-71ca-4b14-9feb-c2036ed86648\") " pod="openshift-route-controller-manager/route-controller-manager-57cf5fdf8c-kzjzb" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.048213 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e5605b6-71ca-4b14-9feb-c2036ed86648-serving-cert\") pod \"route-controller-manager-57cf5fdf8c-kzjzb\" (UID: \"2e5605b6-71ca-4b14-9feb-c2036ed86648\") " pod="openshift-route-controller-manager/route-controller-manager-57cf5fdf8c-kzjzb" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.048242 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w667c\" (UniqueName: \"kubernetes.io/projected/2e5605b6-71ca-4b14-9feb-c2036ed86648-kube-api-access-w667c\") pod \"route-controller-manager-57cf5fdf8c-kzjzb\" (UID: \"2e5605b6-71ca-4b14-9feb-c2036ed86648\") " pod="openshift-route-controller-manager/route-controller-manager-57cf5fdf8c-kzjzb" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.048271 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6zpn\" (UniqueName: \"kubernetes.io/projected/963d91ec-628d-4269-bfc9-2c6ffb4845b9-kube-api-access-h6zpn\") pod \"controller-manager-7b64d957-q2tx6\" (UID: \"963d91ec-628d-4269-bfc9-2c6ffb4845b9\") " pod="openshift-controller-manager/controller-manager-7b64d957-q2tx6" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.048526 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/963d91ec-628d-4269-bfc9-2c6ffb4845b9-serving-cert\") pod \"controller-manager-7b64d957-q2tx6\" (UID: \"963d91ec-628d-4269-bfc9-2c6ffb4845b9\") " pod="openshift-controller-manager/controller-manager-7b64d957-q2tx6" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.048626 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2e5605b6-71ca-4b14-9feb-c2036ed86648-client-ca\") pod \"route-controller-manager-57cf5fdf8c-kzjzb\" (UID: \"2e5605b6-71ca-4b14-9feb-c2036ed86648\") " pod="openshift-route-controller-manager/route-controller-manager-57cf5fdf8c-kzjzb" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.048695 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/963d91ec-628d-4269-bfc9-2c6ffb4845b9-config\") pod \"controller-manager-7b64d957-q2tx6\" (UID: \"963d91ec-628d-4269-bfc9-2c6ffb4845b9\") " pod="openshift-controller-manager/controller-manager-7b64d957-q2tx6" Feb 24 00:09:40 crc kubenswrapper[4824]: W0224 00:09:40.051756 4824 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9016587d_3cd5_46d7_bd50_586cd32933f7.slice/crio-7258e3c460d9eb30e7b444c92e1cb2427c103a3e9b4014b73c4a4fe6cecde128 WatchSource:0}: Error finding container 7258e3c460d9eb30e7b444c92e1cb2427c103a3e9b4014b73c4a4fe6cecde128: Status 404 returned error can't find the container with id 7258e3c460d9eb30e7b444c92e1cb2427c103a3e9b4014b73c4a4fe6cecde128 Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.058920 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" event={"ID":"9016587d-3cd5-46d7-bd50-586cd32933f7","Type":"ContainerStarted","Data":"7258e3c460d9eb30e7b444c92e1cb2427c103a3e9b4014b73c4a4fe6cecde128"} Feb 24 00:09:40 crc 
kubenswrapper[4824]: I0224 00:09:40.068275 4824 generic.go:334] "Generic (PLEG): container finished" podID="b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4" containerID="8bc4c4e74760164a686d72949257ade00d0c3cfa06f89e62b82604ab57ead475" exitCode=0 Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.068360 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6kgrd" event={"ID":"b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4","Type":"ContainerDied","Data":"8bc4c4e74760164a686d72949257ade00d0c3cfa06f89e62b82604ab57ead475"} Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.068390 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6kgrd" event={"ID":"b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4","Type":"ContainerStarted","Data":"a33e62fe6f2549eb1208d3cf356835348bf6325f74507046a34d8f566aaa9f3c"} Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.075088 4824 generic.go:334] "Generic (PLEG): container finished" podID="3e306ddf-071d-47f2-b9b1-bf772963438e" containerID="165f557a643df29a5f3055b0f6055d2350a6f07b3c59175faba79784672bcb83" exitCode=0 Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.075150 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hhftg" event={"ID":"3e306ddf-071d-47f2-b9b1-bf772963438e","Type":"ContainerDied","Data":"165f557a643df29a5f3055b0f6055d2350a6f07b3c59175faba79784672bcb83"} Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.075174 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hhftg" event={"ID":"3e306ddf-071d-47f2-b9b1-bf772963438e","Type":"ContainerStarted","Data":"24c650baf1648fdbc140def26b06acbc896c72aa2095332a4a2cc286bdf3cc0c"} Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.076566 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
event={"ID":"393dd5ac-a813-412e-ac2d-1d654d3e5c64","Type":"ContainerStarted","Data":"49c784ba799f271351dfe9e2df364e535f8e5fabd915dec0cc780fab30b4c4e0"} Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.083829 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9k27r" event={"ID":"1c4e1d48-7f8d-44b6-97b3-3ceccb35385b","Type":"ContainerDied","Data":"d168e7b48bb45e2f3eeaabaaae34172927392199794b2e25a747dcf303c33d18"} Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.083878 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9k27r" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.083892 4824 scope.go:117] "RemoveContainer" containerID="b81e27c4f382da70441b9aeabc639347240cb4f01fe8b81ba75c60688c944968" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.088655 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-jm7qk" event={"ID":"49f90ffa-baa8-4d9b-bd17-3a12f9c6f115","Type":"ContainerDied","Data":"3b88c71c7f646381790daa9790f722b3637990f46735a62cbc3312b308a3ab9b"} Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.088814 4824 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-jm7qk" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.115242 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57cf5fdf8c-kzjzb"] Feb 24 00:09:40 crc kubenswrapper[4824]: E0224 00:09:40.116011 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca config kube-api-access-w667c serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-route-controller-manager/route-controller-manager-57cf5fdf8c-kzjzb" podUID="2e5605b6-71ca-4b14-9feb-c2036ed86648" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.117349 4824 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.119369 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7b64d957-q2tx6"] Feb 24 00:09:40 crc kubenswrapper[4824]: E0224 00:09:40.124063 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca config kube-api-access-h6zpn proxy-ca-bundles serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-7b64d957-q2tx6" podUID="963d91ec-628d-4269-bfc9-2c6ffb4845b9" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.151399 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/963d91ec-628d-4269-bfc9-2c6ffb4845b9-serving-cert\") pod \"controller-manager-7b64d957-q2tx6\" (UID: \"963d91ec-628d-4269-bfc9-2c6ffb4845b9\") " pod="openshift-controller-manager/controller-manager-7b64d957-q2tx6" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.151442 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2e5605b6-71ca-4b14-9feb-c2036ed86648-client-ca\") pod \"route-controller-manager-57cf5fdf8c-kzjzb\" (UID: \"2e5605b6-71ca-4b14-9feb-c2036ed86648\") " pod="openshift-route-controller-manager/route-controller-manager-57cf5fdf8c-kzjzb" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.151473 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/963d91ec-628d-4269-bfc9-2c6ffb4845b9-config\") pod \"controller-manager-7b64d957-q2tx6\" (UID: \"963d91ec-628d-4269-bfc9-2c6ffb4845b9\") " pod="openshift-controller-manager/controller-manager-7b64d957-q2tx6" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.151510 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/963d91ec-628d-4269-bfc9-2c6ffb4845b9-client-ca\") pod \"controller-manager-7b64d957-q2tx6\" (UID: \"963d91ec-628d-4269-bfc9-2c6ffb4845b9\") " pod="openshift-controller-manager/controller-manager-7b64d957-q2tx6" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.151541 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/963d91ec-628d-4269-bfc9-2c6ffb4845b9-proxy-ca-bundles\") pod \"controller-manager-7b64d957-q2tx6\" (UID: \"963d91ec-628d-4269-bfc9-2c6ffb4845b9\") " pod="openshift-controller-manager/controller-manager-7b64d957-q2tx6" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.151568 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e5605b6-71ca-4b14-9feb-c2036ed86648-config\") pod \"route-controller-manager-57cf5fdf8c-kzjzb\" (UID: \"2e5605b6-71ca-4b14-9feb-c2036ed86648\") " pod="openshift-route-controller-manager/route-controller-manager-57cf5fdf8c-kzjzb" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 
00:09:40.151625 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e5605b6-71ca-4b14-9feb-c2036ed86648-serving-cert\") pod \"route-controller-manager-57cf5fdf8c-kzjzb\" (UID: \"2e5605b6-71ca-4b14-9feb-c2036ed86648\") " pod="openshift-route-controller-manager/route-controller-manager-57cf5fdf8c-kzjzb" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.151652 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w667c\" (UniqueName: \"kubernetes.io/projected/2e5605b6-71ca-4b14-9feb-c2036ed86648-kube-api-access-w667c\") pod \"route-controller-manager-57cf5fdf8c-kzjzb\" (UID: \"2e5605b6-71ca-4b14-9feb-c2036ed86648\") " pod="openshift-route-controller-manager/route-controller-manager-57cf5fdf8c-kzjzb" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.151676 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6zpn\" (UniqueName: \"kubernetes.io/projected/963d91ec-628d-4269-bfc9-2c6ffb4845b9-kube-api-access-h6zpn\") pod \"controller-manager-7b64d957-q2tx6\" (UID: \"963d91ec-628d-4269-bfc9-2c6ffb4845b9\") " pod="openshift-controller-manager/controller-manager-7b64d957-q2tx6" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.154733 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/963d91ec-628d-4269-bfc9-2c6ffb4845b9-proxy-ca-bundles\") pod \"controller-manager-7b64d957-q2tx6\" (UID: \"963d91ec-628d-4269-bfc9-2c6ffb4845b9\") " pod="openshift-controller-manager/controller-manager-7b64d957-q2tx6" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.155619 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e5605b6-71ca-4b14-9feb-c2036ed86648-config\") pod \"route-controller-manager-57cf5fdf8c-kzjzb\" (UID: 
\"2e5605b6-71ca-4b14-9feb-c2036ed86648\") " pod="openshift-route-controller-manager/route-controller-manager-57cf5fdf8c-kzjzb" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.155652 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2e5605b6-71ca-4b14-9feb-c2036ed86648-client-ca\") pod \"route-controller-manager-57cf5fdf8c-kzjzb\" (UID: \"2e5605b6-71ca-4b14-9feb-c2036ed86648\") " pod="openshift-route-controller-manager/route-controller-manager-57cf5fdf8c-kzjzb" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.156704 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/963d91ec-628d-4269-bfc9-2c6ffb4845b9-client-ca\") pod \"controller-manager-7b64d957-q2tx6\" (UID: \"963d91ec-628d-4269-bfc9-2c6ffb4845b9\") " pod="openshift-controller-manager/controller-manager-7b64d957-q2tx6" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.156790 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/963d91ec-628d-4269-bfc9-2c6ffb4845b9-config\") pod \"controller-manager-7b64d957-q2tx6\" (UID: \"963d91ec-628d-4269-bfc9-2c6ffb4845b9\") " pod="openshift-controller-manager/controller-manager-7b64d957-q2tx6" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.158109 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-dq9gz" event={"ID":"42d75b69-be96-43de-8687-444a81d8ebd5","Type":"ContainerStarted","Data":"3f6ee22c9b5dca052da8f48e0e4d619b636f0d5e852c1428d9ec153607ed60b3"} Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.170446 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nzqwf"] Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.170999 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/2e5605b6-71ca-4b14-9feb-c2036ed86648-serving-cert\") pod \"route-controller-manager-57cf5fdf8c-kzjzb\" (UID: \"2e5605b6-71ca-4b14-9feb-c2036ed86648\") " pod="openshift-route-controller-manager/route-controller-manager-57cf5fdf8c-kzjzb" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.184731 4824 generic.go:334] "Generic (PLEG): container finished" podID="cc119514-5c95-4925-8a1a-3e6844a34e1e" containerID="0a0ed56ec1c248b7e9a29be5b115fbce61e3aaf9d9d45ae0ef4e45bef08de9cc" exitCode=0 Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.185079 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dmjz7" event={"ID":"cc119514-5c95-4925-8a1a-3e6844a34e1e","Type":"ContainerDied","Data":"0a0ed56ec1c248b7e9a29be5b115fbce61e3aaf9d9d45ae0ef4e45bef08de9cc"} Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.189987 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w667c\" (UniqueName: \"kubernetes.io/projected/2e5605b6-71ca-4b14-9feb-c2036ed86648-kube-api-access-w667c\") pod \"route-controller-manager-57cf5fdf8c-kzjzb\" (UID: \"2e5605b6-71ca-4b14-9feb-c2036ed86648\") " pod="openshift-route-controller-manager/route-controller-manager-57cf5fdf8c-kzjzb" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.214879 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/963d91ec-628d-4269-bfc9-2c6ffb4845b9-serving-cert\") pod \"controller-manager-7b64d957-q2tx6\" (UID: \"963d91ec-628d-4269-bfc9-2c6ffb4845b9\") " pod="openshift-controller-manager/controller-manager-7b64d957-q2tx6" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.215612 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6zpn\" (UniqueName: \"kubernetes.io/projected/963d91ec-628d-4269-bfc9-2c6ffb4845b9-kube-api-access-h6zpn\") pod 
\"controller-manager-7b64d957-q2tx6\" (UID: \"963d91ec-628d-4269-bfc9-2c6ffb4845b9\") " pod="openshift-controller-manager/controller-manager-7b64d957-q2tx6" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.215857 4824 scope.go:117] "RemoveContainer" containerID="5b26b0b907a714e0eb8fa1b65c28de4396d3dfbda124fc0f95f4d779730bf39c" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.216221 4824 generic.go:334] "Generic (PLEG): container finished" podID="b00860ed-9085-40bb-9041-16eac6d88fb1" containerID="b1c2dccdb4d30a9f6f73707633f7236dc1482d098520afb3ff0820f1bf1efcba" exitCode=0 Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.225718 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bfhcg" event={"ID":"b00860ed-9085-40bb-9041-16eac6d88fb1","Type":"ContainerDied","Data":"b1c2dccdb4d30a9f6f73707633f7236dc1482d098520afb3ff0820f1bf1efcba"} Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.225793 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bfhcg" event={"ID":"b00860ed-9085-40bb-9041-16eac6d88fb1","Type":"ContainerStarted","Data":"9c018af5a403cd93073513dceb16d5cd69816c1726faff3f2cab188f2753d464"} Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.246916 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.246892386 podStartE2EDuration="3.246892386s" podCreationTimestamp="2026-02-24 00:09:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:09:40.224287274 +0000 UTC m=+244.213911763" watchObservedRunningTime="2026-02-24 00:09:40.246892386 +0000 UTC m=+244.236516855" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.284463 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="hostpath-provisioner/csi-hostpathplugin-dq9gz" podStartSLOduration=16.284442229 podStartE2EDuration="16.284442229s" podCreationTimestamp="2026-02-24 00:09:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:09:40.282533369 +0000 UTC m=+244.272157838" watchObservedRunningTime="2026-02-24 00:09:40.284442229 +0000 UTC m=+244.274066698" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.423720 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zxplg"] Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.425900 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zxplg" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.437232 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.442883 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-jm7qk"] Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.459477 4824 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-jm7qk"] Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.465960 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zxplg"] Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.466209 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a78c7d6-6ec6-4857-af87-25c5c8cf961d-utilities\") pod \"redhat-operators-zxplg\" (UID: \"7a78c7d6-6ec6-4857-af87-25c5c8cf961d\") " pod="openshift-marketplace/redhat-operators-zxplg" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.466297 
4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zc6ds\" (UniqueName: \"kubernetes.io/projected/7a78c7d6-6ec6-4857-af87-25c5c8cf961d-kube-api-access-zc6ds\") pod \"redhat-operators-zxplg\" (UID: \"7a78c7d6-6ec6-4857-af87-25c5c8cf961d\") " pod="openshift-marketplace/redhat-operators-zxplg" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.466339 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a78c7d6-6ec6-4857-af87-25c5c8cf961d-catalog-content\") pod \"redhat-operators-zxplg\" (UID: \"7a78c7d6-6ec6-4857-af87-25c5c8cf961d\") " pod="openshift-marketplace/redhat-operators-zxplg" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.470203 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9k27r"] Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.474352 4824 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9k27r"] Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.568355 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zc6ds\" (UniqueName: \"kubernetes.io/projected/7a78c7d6-6ec6-4857-af87-25c5c8cf961d-kube-api-access-zc6ds\") pod \"redhat-operators-zxplg\" (UID: \"7a78c7d6-6ec6-4857-af87-25c5c8cf961d\") " pod="openshift-marketplace/redhat-operators-zxplg" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.568427 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a78c7d6-6ec6-4857-af87-25c5c8cf961d-catalog-content\") pod \"redhat-operators-zxplg\" (UID: \"7a78c7d6-6ec6-4857-af87-25c5c8cf961d\") " pod="openshift-marketplace/redhat-operators-zxplg" Feb 24 00:09:40 crc 
kubenswrapper[4824]: I0224 00:09:40.569129 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a78c7d6-6ec6-4857-af87-25c5c8cf961d-utilities\") pod \"redhat-operators-zxplg\" (UID: \"7a78c7d6-6ec6-4857-af87-25c5c8cf961d\") " pod="openshift-marketplace/redhat-operators-zxplg" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.570983 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a78c7d6-6ec6-4857-af87-25c5c8cf961d-utilities\") pod \"redhat-operators-zxplg\" (UID: \"7a78c7d6-6ec6-4857-af87-25c5c8cf961d\") " pod="openshift-marketplace/redhat-operators-zxplg" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.571769 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a78c7d6-6ec6-4857-af87-25c5c8cf961d-catalog-content\") pod \"redhat-operators-zxplg\" (UID: \"7a78c7d6-6ec6-4857-af87-25c5c8cf961d\") " pod="openshift-marketplace/redhat-operators-zxplg" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.573504 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mfzkw"] Feb 24 00:09:40 crc kubenswrapper[4824]: W0224 00:09:40.588444 4824 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08de7fe0_2d54_408b_8e09_3e1b9bcf931a.slice/crio-eef69387238650e8470f3abae2d3e6234452b8da7d8847435def291fdad9a1d8 WatchSource:0}: Error finding container eef69387238650e8470f3abae2d3e6234452b8da7d8847435def291fdad9a1d8: Status 404 returned error can't find the container with id eef69387238650e8470f3abae2d3e6234452b8da7d8847435def291fdad9a1d8 Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.594887 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zc6ds\" (UniqueName: 
\"kubernetes.io/projected/7a78c7d6-6ec6-4857-af87-25c5c8cf961d-kube-api-access-zc6ds\") pod \"redhat-operators-zxplg\" (UID: \"7a78c7d6-6ec6-4857-af87-25c5c8cf961d\") " pod="openshift-marketplace/redhat-operators-zxplg" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.688777 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.718108 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c4e1d48-7f8d-44b6-97b3-3ceccb35385b" path="/var/lib/kubelet/pods/1c4e1d48-7f8d-44b6-97b3-3ceccb35385b/volumes" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.718687 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49f90ffa-baa8-4d9b-bd17-3a12f9c6f115" path="/var/lib/kubelet/pods/49f90ffa-baa8-4d9b-bd17-3a12f9c6f115/volumes" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.719547 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.782509 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gl27t"] Feb 24 00:09:40 crc kubenswrapper[4824]: E0224 00:09:40.782954 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="671ced26-8fac-4a17-a516-ab23ebcd6945" containerName="pruner" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.782973 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="671ced26-8fac-4a17-a516-ab23ebcd6945" containerName="pruner" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.783167 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="671ced26-8fac-4a17-a516-ab23ebcd6945" containerName="pruner" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.784557 4824 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gl27t" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.788305 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gl27t"] Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.802635 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zxplg" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.875427 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/671ced26-8fac-4a17-a516-ab23ebcd6945-kube-api-access\") pod \"671ced26-8fac-4a17-a516-ab23ebcd6945\" (UID: \"671ced26-8fac-4a17-a516-ab23ebcd6945\") " Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.875572 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/671ced26-8fac-4a17-a516-ab23ebcd6945-kubelet-dir\") pod \"671ced26-8fac-4a17-a516-ab23ebcd6945\" (UID: \"671ced26-8fac-4a17-a516-ab23ebcd6945\") " Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.876037 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/671ced26-8fac-4a17-a516-ab23ebcd6945-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "671ced26-8fac-4a17-a516-ab23ebcd6945" (UID: "671ced26-8fac-4a17-a516-ab23ebcd6945"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.878599 4824 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/671ced26-8fac-4a17-a516-ab23ebcd6945-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.890924 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/671ced26-8fac-4a17-a516-ab23ebcd6945-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "671ced26-8fac-4a17-a516-ab23ebcd6945" (UID: "671ced26-8fac-4a17-a516-ab23ebcd6945"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.905372 4824 patch_prober.go:28] interesting pod/router-default-5444994796-fp4wq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 24 00:09:40 crc kubenswrapper[4824]: [-]has-synced failed: reason withheld Feb 24 00:09:40 crc kubenswrapper[4824]: [+]process-running ok Feb 24 00:09:40 crc kubenswrapper[4824]: healthz check failed Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.905483 4824 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fp4wq" podUID="5b0ff99f-1e04-4e23-895a-a02a303c8daa" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.982304 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fhvh\" (UniqueName: \"kubernetes.io/projected/2da73289-3f96-4828-a106-46c3b0469e7d-kube-api-access-7fhvh\") pod \"redhat-operators-gl27t\" (UID: \"2da73289-3f96-4828-a106-46c3b0469e7d\") " 
pod="openshift-marketplace/redhat-operators-gl27t" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.983067 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2da73289-3f96-4828-a106-46c3b0469e7d-catalog-content\") pod \"redhat-operators-gl27t\" (UID: \"2da73289-3f96-4828-a106-46c3b0469e7d\") " pod="openshift-marketplace/redhat-operators-gl27t" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.983155 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2da73289-3f96-4828-a106-46c3b0469e7d-utilities\") pod \"redhat-operators-gl27t\" (UID: \"2da73289-3f96-4828-a106-46c3b0469e7d\") " pod="openshift-marketplace/redhat-operators-gl27t" Feb 24 00:09:40 crc kubenswrapper[4824]: I0224 00:09:40.983293 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/671ced26-8fac-4a17-a516-ab23ebcd6945-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.086384 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2da73289-3f96-4828-a106-46c3b0469e7d-utilities\") pod \"redhat-operators-gl27t\" (UID: \"2da73289-3f96-4828-a106-46c3b0469e7d\") " pod="openshift-marketplace/redhat-operators-gl27t" Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.086548 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fhvh\" (UniqueName: \"kubernetes.io/projected/2da73289-3f96-4828-a106-46c3b0469e7d-kube-api-access-7fhvh\") pod \"redhat-operators-gl27t\" (UID: \"2da73289-3f96-4828-a106-46c3b0469e7d\") " pod="openshift-marketplace/redhat-operators-gl27t" Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.086640 4824 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2da73289-3f96-4828-a106-46c3b0469e7d-catalog-content\") pod \"redhat-operators-gl27t\" (UID: \"2da73289-3f96-4828-a106-46c3b0469e7d\") " pod="openshift-marketplace/redhat-operators-gl27t" Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.087365 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2da73289-3f96-4828-a106-46c3b0469e7d-catalog-content\") pod \"redhat-operators-gl27t\" (UID: \"2da73289-3f96-4828-a106-46c3b0469e7d\") " pod="openshift-marketplace/redhat-operators-gl27t" Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.087365 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2da73289-3f96-4828-a106-46c3b0469e7d-utilities\") pod \"redhat-operators-gl27t\" (UID: \"2da73289-3f96-4828-a106-46c3b0469e7d\") " pod="openshift-marketplace/redhat-operators-gl27t" Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.134218 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fhvh\" (UniqueName: \"kubernetes.io/projected/2da73289-3f96-4828-a106-46c3b0469e7d-kube-api-access-7fhvh\") pod \"redhat-operators-gl27t\" (UID: \"2da73289-3f96-4828-a106-46c3b0469e7d\") " pod="openshift-marketplace/redhat-operators-gl27t" Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.200136 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zxplg"] Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.257221 4824 generic.go:334] "Generic (PLEG): container finished" podID="b142d96b-87c3-444b-b135-fdddaa658234" containerID="5af3da4115b49b00d3bb13283e7fccd617f9a8fbd1e5c6782e319a1b0a15e513" exitCode=0 Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.257395 4824 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nzqwf" event={"ID":"b142d96b-87c3-444b-b135-fdddaa658234","Type":"ContainerDied","Data":"5af3da4115b49b00d3bb13283e7fccd617f9a8fbd1e5c6782e319a1b0a15e513"} Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.257442 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nzqwf" event={"ID":"b142d96b-87c3-444b-b135-fdddaa658234","Type":"ContainerStarted","Data":"6087d7cb108c4772f5476645e00887a465effb8e262d89a746313bbbb9fb34f8"} Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.282098 4824 generic.go:334] "Generic (PLEG): container finished" podID="239fc97c-cb5a-4fa1-965e-7b64c90268ce" containerID="594ea7953af708dc6eec520d0cd46b08f1c6126425d4ad263d064dfe050100f2" exitCode=0 Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.282312 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29531520-xxvzq" event={"ID":"239fc97c-cb5a-4fa1-965e-7b64c90268ce","Type":"ContainerDied","Data":"594ea7953af708dc6eec520d0cd46b08f1c6126425d4ad263d064dfe050100f2"} Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.292858 4824 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.292903 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"671ced26-8fac-4a17-a516-ab23ebcd6945","Type":"ContainerDied","Data":"dc2fc5cc3054549841a2254f5e01ca95caee472926cb159e358b999488865ad7"} Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.292987 4824 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc2fc5cc3054549841a2254f5e01ca95caee472926cb159e358b999488865ad7" Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.299323 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" event={"ID":"9016587d-3cd5-46d7-bd50-586cd32933f7","Type":"ContainerStarted","Data":"2298440fb6247a85651f7297a9c931c5be1fbe577d08f83bab4ec4ae10e9e795"} Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.299791 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.302793 4824 generic.go:334] "Generic (PLEG): container finished" podID="393dd5ac-a813-412e-ac2d-1d654d3e5c64" containerID="49c784ba799f271351dfe9e2df364e535f8e5fabd915dec0cc780fab30b4c4e0" exitCode=0 Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.302876 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"393dd5ac-a813-412e-ac2d-1d654d3e5c64","Type":"ContainerDied","Data":"49c784ba799f271351dfe9e2df364e535f8e5fabd915dec0cc780fab30b4c4e0"} Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.306819 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mfzkw" 
event={"ID":"08de7fe0-2d54-408b-8e09-3e1b9bcf931a","Type":"ContainerDied","Data":"095e0801ed7074f62f4831cd0263857d8e7741a2b32dd7eae5398cdc3839e533"} Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.306478 4824 generic.go:334] "Generic (PLEG): container finished" podID="08de7fe0-2d54-408b-8e09-3e1b9bcf931a" containerID="095e0801ed7074f62f4831cd0263857d8e7741a2b32dd7eae5398cdc3839e533" exitCode=0 Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.307356 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b64d957-q2tx6" Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.307420 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mfzkw" event={"ID":"08de7fe0-2d54-408b-8e09-3e1b9bcf931a","Type":"ContainerStarted","Data":"eef69387238650e8470f3abae2d3e6234452b8da7d8847435def291fdad9a1d8"} Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.307650 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57cf5fdf8c-kzjzb" Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.335780 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b64d957-q2tx6" Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.337212 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57cf5fdf8c-kzjzb" Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.382891 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" podStartSLOduration=171.382868509 podStartE2EDuration="2m51.382868509s" podCreationTimestamp="2026-02-24 00:06:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:09:41.379955563 +0000 UTC m=+245.369580052" watchObservedRunningTime="2026-02-24 00:09:41.382868509 +0000 UTC m=+245.372492988" Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.406296 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/963d91ec-628d-4269-bfc9-2c6ffb4845b9-proxy-ca-bundles\") pod \"963d91ec-628d-4269-bfc9-2c6ffb4845b9\" (UID: \"963d91ec-628d-4269-bfc9-2c6ffb4845b9\") " Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.406358 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/963d91ec-628d-4269-bfc9-2c6ffb4845b9-config\") pod \"963d91ec-628d-4269-bfc9-2c6ffb4845b9\" (UID: \"963d91ec-628d-4269-bfc9-2c6ffb4845b9\") " Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.406395 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2e5605b6-71ca-4b14-9feb-c2036ed86648-client-ca\") pod \"2e5605b6-71ca-4b14-9feb-c2036ed86648\" (UID: \"2e5605b6-71ca-4b14-9feb-c2036ed86648\") " Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.406425 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/963d91ec-628d-4269-bfc9-2c6ffb4845b9-client-ca\") pod 
\"963d91ec-628d-4269-bfc9-2c6ffb4845b9\" (UID: \"963d91ec-628d-4269-bfc9-2c6ffb4845b9\") " Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.406485 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w667c\" (UniqueName: \"kubernetes.io/projected/2e5605b6-71ca-4b14-9feb-c2036ed86648-kube-api-access-w667c\") pod \"2e5605b6-71ca-4b14-9feb-c2036ed86648\" (UID: \"2e5605b6-71ca-4b14-9feb-c2036ed86648\") " Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.406531 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e5605b6-71ca-4b14-9feb-c2036ed86648-config\") pod \"2e5605b6-71ca-4b14-9feb-c2036ed86648\" (UID: \"2e5605b6-71ca-4b14-9feb-c2036ed86648\") " Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.407475 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e5605b6-71ca-4b14-9feb-c2036ed86648-client-ca" (OuterVolumeSpecName: "client-ca") pod "2e5605b6-71ca-4b14-9feb-c2036ed86648" (UID: "2e5605b6-71ca-4b14-9feb-c2036ed86648"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.407696 4824 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2e5605b6-71ca-4b14-9feb-c2036ed86648-client-ca\") on node \"crc\" DevicePath \"\"" Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.409984 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/963d91ec-628d-4269-bfc9-2c6ffb4845b9-client-ca" (OuterVolumeSpecName: "client-ca") pod "963d91ec-628d-4269-bfc9-2c6ffb4845b9" (UID: "963d91ec-628d-4269-bfc9-2c6ffb4845b9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.413835 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gl27t" Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.414008 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/963d91ec-628d-4269-bfc9-2c6ffb4845b9-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "963d91ec-628d-4269-bfc9-2c6ffb4845b9" (UID: "963d91ec-628d-4269-bfc9-2c6ffb4845b9"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.414267 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/963d91ec-628d-4269-bfc9-2c6ffb4845b9-config" (OuterVolumeSpecName: "config") pod "963d91ec-628d-4269-bfc9-2c6ffb4845b9" (UID: "963d91ec-628d-4269-bfc9-2c6ffb4845b9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.424088 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e5605b6-71ca-4b14-9feb-c2036ed86648-config" (OuterVolumeSpecName: "config") pod "2e5605b6-71ca-4b14-9feb-c2036ed86648" (UID: "2e5605b6-71ca-4b14-9feb-c2036ed86648"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.443060 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e5605b6-71ca-4b14-9feb-c2036ed86648-kube-api-access-w667c" (OuterVolumeSpecName: "kube-api-access-w667c") pod "2e5605b6-71ca-4b14-9feb-c2036ed86648" (UID: "2e5605b6-71ca-4b14-9feb-c2036ed86648"). InnerVolumeSpecName "kube-api-access-w667c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.508445 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e5605b6-71ca-4b14-9feb-c2036ed86648-serving-cert\") pod \"2e5605b6-71ca-4b14-9feb-c2036ed86648\" (UID: \"2e5605b6-71ca-4b14-9feb-c2036ed86648\") " Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.509093 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/963d91ec-628d-4269-bfc9-2c6ffb4845b9-serving-cert\") pod \"963d91ec-628d-4269-bfc9-2c6ffb4845b9\" (UID: \"963d91ec-628d-4269-bfc9-2c6ffb4845b9\") " Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.509176 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6zpn\" (UniqueName: \"kubernetes.io/projected/963d91ec-628d-4269-bfc9-2c6ffb4845b9-kube-api-access-h6zpn\") pod \"963d91ec-628d-4269-bfc9-2c6ffb4845b9\" (UID: \"963d91ec-628d-4269-bfc9-2c6ffb4845b9\") " Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.509390 4824 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/963d91ec-628d-4269-bfc9-2c6ffb4845b9-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.509403 4824 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/963d91ec-628d-4269-bfc9-2c6ffb4845b9-config\") on node \"crc\" DevicePath \"\"" Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.509413 4824 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/963d91ec-628d-4269-bfc9-2c6ffb4845b9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.509422 4824 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-w667c\" (UniqueName: \"kubernetes.io/projected/2e5605b6-71ca-4b14-9feb-c2036ed86648-kube-api-access-w667c\") on node \"crc\" DevicePath \"\"" Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.509433 4824 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e5605b6-71ca-4b14-9feb-c2036ed86648-config\") on node \"crc\" DevicePath \"\"" Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.520083 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e5605b6-71ca-4b14-9feb-c2036ed86648-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2e5605b6-71ca-4b14-9feb-c2036ed86648" (UID: "2e5605b6-71ca-4b14-9feb-c2036ed86648"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.525436 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/963d91ec-628d-4269-bfc9-2c6ffb4845b9-kube-api-access-h6zpn" (OuterVolumeSpecName: "kube-api-access-h6zpn") pod "963d91ec-628d-4269-bfc9-2c6ffb4845b9" (UID: "963d91ec-628d-4269-bfc9-2c6ffb4845b9"). InnerVolumeSpecName "kube-api-access-h6zpn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.533286 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/963d91ec-628d-4269-bfc9-2c6ffb4845b9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "963d91ec-628d-4269-bfc9-2c6ffb4845b9" (UID: "963d91ec-628d-4269-bfc9-2c6ffb4845b9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.587023 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jkghx" Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.610711 4824 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e5605b6-71ca-4b14-9feb-c2036ed86648-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.610738 4824 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/963d91ec-628d-4269-bfc9-2c6ffb4845b9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.610749 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6zpn\" (UniqueName: \"kubernetes.io/projected/963d91ec-628d-4269-bfc9-2c6ffb4845b9-kube-api-access-h6zpn\") on node \"crc\" DevicePath \"\"" Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.812643 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gl27t"] Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.902764 4824 patch_prober.go:28] interesting pod/router-default-5444994796-fp4wq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 24 00:09:41 crc kubenswrapper[4824]: [-]has-synced failed: reason withheld Feb 24 00:09:41 crc kubenswrapper[4824]: [+]process-running ok Feb 24 00:09:41 crc kubenswrapper[4824]: healthz check failed Feb 24 00:09:41 crc kubenswrapper[4824]: I0224 00:09:41.903103 4824 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fp4wq" podUID="5b0ff99f-1e04-4e23-895a-a02a303c8daa" containerName="router" 
probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 24 00:09:42 crc kubenswrapper[4824]: I0224 00:09:42.044386 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-xfl22"
Feb 24 00:09:42 crc kubenswrapper[4824]: I0224 00:09:42.049352 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-xfl22"
Feb 24 00:09:42 crc kubenswrapper[4824]: I0224 00:09:42.384407 4824 generic.go:334] "Generic (PLEG): container finished" podID="2da73289-3f96-4828-a106-46c3b0469e7d" containerID="92570d872625fe189d1225ae3cfcceb0efc1931cef5c4ee603139bb405c9eff3" exitCode=0
Feb 24 00:09:42 crc kubenswrapper[4824]: I0224 00:09:42.384850 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gl27t" event={"ID":"2da73289-3f96-4828-a106-46c3b0469e7d","Type":"ContainerDied","Data":"92570d872625fe189d1225ae3cfcceb0efc1931cef5c4ee603139bb405c9eff3"}
Feb 24 00:09:42 crc kubenswrapper[4824]: I0224 00:09:42.384921 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gl27t" event={"ID":"2da73289-3f96-4828-a106-46c3b0469e7d","Type":"ContainerStarted","Data":"2503a134b22274bc6e70e9fb4c998a82c8a291a8ce5041c5a448cbf0b7c362a7"}
Feb 24 00:09:42 crc kubenswrapper[4824]: I0224 00:09:42.387680 4824 generic.go:334] "Generic (PLEG): container finished" podID="7a78c7d6-6ec6-4857-af87-25c5c8cf961d" containerID="05173dd075227354e5c8172cf583a8c34fd894215338d07f6c1a9644348f85b0" exitCode=0
Feb 24 00:09:42 crc kubenswrapper[4824]: I0224 00:09:42.387985 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zxplg" event={"ID":"7a78c7d6-6ec6-4857-af87-25c5c8cf961d","Type":"ContainerDied","Data":"05173dd075227354e5c8172cf583a8c34fd894215338d07f6c1a9644348f85b0"}
Feb 24 00:09:42 crc kubenswrapper[4824]: I0224 00:09:42.388015 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b64d957-q2tx6"
Feb 24 00:09:42 crc kubenswrapper[4824]: I0224 00:09:42.388049 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zxplg" event={"ID":"7a78c7d6-6ec6-4857-af87-25c5c8cf961d","Type":"ContainerStarted","Data":"6ef8798cf5f3aadb98a5ae1d2d3bf34bb35cf168ac8076ee6ba9bc741a06b98b"}
Feb 24 00:09:42 crc kubenswrapper[4824]: I0224 00:09:42.388457 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57cf5fdf8c-kzjzb"
Feb 24 00:09:42 crc kubenswrapper[4824]: I0224 00:09:42.431655 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-5n768"
Feb 24 00:09:42 crc kubenswrapper[4824]: I0224 00:09:42.467810 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7b64d957-q2tx6"]
Feb 24 00:09:42 crc kubenswrapper[4824]: I0224 00:09:42.470047 4824 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7b64d957-q2tx6"]
Feb 24 00:09:42 crc kubenswrapper[4824]: I0224 00:09:42.514240 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57cf5fdf8c-kzjzb"]
Feb 24 00:09:42 crc kubenswrapper[4824]: I0224 00:09:42.535338 4824 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57cf5fdf8c-kzjzb"]
Feb 24 00:09:42 crc kubenswrapper[4824]: I0224 00:09:42.749440 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e5605b6-71ca-4b14-9feb-c2036ed86648" path="/var/lib/kubelet/pods/2e5605b6-71ca-4b14-9feb-c2036ed86648/volumes"
Feb 24 00:09:42 crc kubenswrapper[4824]: I0224 00:09:42.750123 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="963d91ec-628d-4269-bfc9-2c6ffb4845b9" path="/var/lib/kubelet/pods/963d91ec-628d-4269-bfc9-2c6ffb4845b9/volumes"
Feb 24 00:09:42 crc kubenswrapper[4824]: I0224 00:09:42.907713 4824 patch_prober.go:28] interesting pod/router-default-5444994796-fp4wq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 24 00:09:42 crc kubenswrapper[4824]: [-]has-synced failed: reason withheld
Feb 24 00:09:42 crc kubenswrapper[4824]: [+]process-running ok
Feb 24 00:09:42 crc kubenswrapper[4824]: healthz check failed
Feb 24 00:09:42 crc kubenswrapper[4824]: I0224 00:09:42.907782 4824 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fp4wq" podUID="5b0ff99f-1e04-4e23-895a-a02a303c8daa" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 24 00:09:42 crc kubenswrapper[4824]: I0224 00:09:42.953842 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29531520-xxvzq"
Feb 24 00:09:42 crc kubenswrapper[4824]: I0224 00:09:42.971961 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 24 00:09:43 crc kubenswrapper[4824]: I0224 00:09:43.056072 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/393dd5ac-a813-412e-ac2d-1d654d3e5c64-kubelet-dir\") pod \"393dd5ac-a813-412e-ac2d-1d654d3e5c64\" (UID: \"393dd5ac-a813-412e-ac2d-1d654d3e5c64\") "
Feb 24 00:09:43 crc kubenswrapper[4824]: I0224 00:09:43.056153 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9glzv\" (UniqueName: \"kubernetes.io/projected/239fc97c-cb5a-4fa1-965e-7b64c90268ce-kube-api-access-9glzv\") pod \"239fc97c-cb5a-4fa1-965e-7b64c90268ce\" (UID: \"239fc97c-cb5a-4fa1-965e-7b64c90268ce\") "
Feb 24 00:09:43 crc kubenswrapper[4824]: I0224 00:09:43.056212 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/393dd5ac-a813-412e-ac2d-1d654d3e5c64-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "393dd5ac-a813-412e-ac2d-1d654d3e5c64" (UID: "393dd5ac-a813-412e-ac2d-1d654d3e5c64"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 24 00:09:43 crc kubenswrapper[4824]: I0224 00:09:43.056261 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/239fc97c-cb5a-4fa1-965e-7b64c90268ce-config-volume\") pod \"239fc97c-cb5a-4fa1-965e-7b64c90268ce\" (UID: \"239fc97c-cb5a-4fa1-965e-7b64c90268ce\") "
Feb 24 00:09:43 crc kubenswrapper[4824]: I0224 00:09:43.056285 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/393dd5ac-a813-412e-ac2d-1d654d3e5c64-kube-api-access\") pod \"393dd5ac-a813-412e-ac2d-1d654d3e5c64\" (UID: \"393dd5ac-a813-412e-ac2d-1d654d3e5c64\") "
Feb 24 00:09:43 crc kubenswrapper[4824]: I0224 00:09:43.056327 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/239fc97c-cb5a-4fa1-965e-7b64c90268ce-secret-volume\") pod \"239fc97c-cb5a-4fa1-965e-7b64c90268ce\" (UID: \"239fc97c-cb5a-4fa1-965e-7b64c90268ce\") "
Feb 24 00:09:43 crc kubenswrapper[4824]: I0224 00:09:43.056606 4824 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/393dd5ac-a813-412e-ac2d-1d654d3e5c64-kubelet-dir\") on node \"crc\" DevicePath \"\""
Feb 24 00:09:43 crc kubenswrapper[4824]: I0224 00:09:43.058553 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/239fc97c-cb5a-4fa1-965e-7b64c90268ce-config-volume" (OuterVolumeSpecName: "config-volume") pod "239fc97c-cb5a-4fa1-965e-7b64c90268ce" (UID: "239fc97c-cb5a-4fa1-965e-7b64c90268ce"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 00:09:43 crc kubenswrapper[4824]: I0224 00:09:43.080753 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/393dd5ac-a813-412e-ac2d-1d654d3e5c64-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "393dd5ac-a813-412e-ac2d-1d654d3e5c64" (UID: "393dd5ac-a813-412e-ac2d-1d654d3e5c64"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 00:09:43 crc kubenswrapper[4824]: I0224 00:09:43.083126 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/239fc97c-cb5a-4fa1-965e-7b64c90268ce-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "239fc97c-cb5a-4fa1-965e-7b64c90268ce" (UID: "239fc97c-cb5a-4fa1-965e-7b64c90268ce"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 00:09:43 crc kubenswrapper[4824]: I0224 00:09:43.083412 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/239fc97c-cb5a-4fa1-965e-7b64c90268ce-kube-api-access-9glzv" (OuterVolumeSpecName: "kube-api-access-9glzv") pod "239fc97c-cb5a-4fa1-965e-7b64c90268ce" (UID: "239fc97c-cb5a-4fa1-965e-7b64c90268ce"). InnerVolumeSpecName "kube-api-access-9glzv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 00:09:43 crc kubenswrapper[4824]: I0224 00:09:43.157959 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9glzv\" (UniqueName: \"kubernetes.io/projected/239fc97c-cb5a-4fa1-965e-7b64c90268ce-kube-api-access-9glzv\") on node \"crc\" DevicePath \"\""
Feb 24 00:09:43 crc kubenswrapper[4824]: I0224 00:09:43.158012 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/393dd5ac-a813-412e-ac2d-1d654d3e5c64-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 24 00:09:43 crc kubenswrapper[4824]: I0224 00:09:43.158027 4824 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/239fc97c-cb5a-4fa1-965e-7b64c90268ce-config-volume\") on node \"crc\" DevicePath \"\""
Feb 24 00:09:43 crc kubenswrapper[4824]: I0224 00:09:43.158041 4824 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/239fc97c-cb5a-4fa1-965e-7b64c90268ce-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 24 00:09:43 crc kubenswrapper[4824]: I0224 00:09:43.398139 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"393dd5ac-a813-412e-ac2d-1d654d3e5c64","Type":"ContainerDied","Data":"7c1c92983cde9f50ed89292c9ba489e6609f2698fffaab5523fc26a4c6ca4f45"}
Feb 24 00:09:43 crc kubenswrapper[4824]: I0224 00:09:43.398202 4824 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c1c92983cde9f50ed89292c9ba489e6609f2698fffaab5523fc26a4c6ca4f45"
Feb 24 00:09:43 crc kubenswrapper[4824]: I0224 00:09:43.398281 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 24 00:09:43 crc kubenswrapper[4824]: I0224 00:09:43.409331 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29531520-xxvzq" event={"ID":"239fc97c-cb5a-4fa1-965e-7b64c90268ce","Type":"ContainerDied","Data":"4d0da2c3da00c6dd6cf100ba43dd4048f42c65ed90df04df6d04e96db17f2c53"}
Feb 24 00:09:43 crc kubenswrapper[4824]: I0224 00:09:43.409382 4824 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d0da2c3da00c6dd6cf100ba43dd4048f42c65ed90df04df6d04e96db17f2c53"
Feb 24 00:09:43 crc kubenswrapper[4824]: I0224 00:09:43.409437 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29531520-xxvzq"
Feb 24 00:09:43 crc kubenswrapper[4824]: I0224 00:09:43.912819 4824 patch_prober.go:28] interesting pod/router-default-5444994796-fp4wq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 24 00:09:43 crc kubenswrapper[4824]: [-]has-synced failed: reason withheld
Feb 24 00:09:43 crc kubenswrapper[4824]: [+]process-running ok
Feb 24 00:09:43 crc kubenswrapper[4824]: healthz check failed
Feb 24 00:09:43 crc kubenswrapper[4824]: I0224 00:09:43.913256 4824 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fp4wq" podUID="5b0ff99f-1e04-4e23-895a-a02a303c8daa" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 24 00:09:43 crc kubenswrapper[4824]: I0224 00:09:43.940684 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-599d8ff48-qktrf"]
Feb 24 00:09:43 crc kubenswrapper[4824]: E0224 00:09:43.940911 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="393dd5ac-a813-412e-ac2d-1d654d3e5c64" containerName="pruner"
Feb 24 00:09:43 crc kubenswrapper[4824]: I0224 00:09:43.940926 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="393dd5ac-a813-412e-ac2d-1d654d3e5c64" containerName="pruner"
Feb 24 00:09:43 crc kubenswrapper[4824]: E0224 00:09:43.940949 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="239fc97c-cb5a-4fa1-965e-7b64c90268ce" containerName="collect-profiles"
Feb 24 00:09:43 crc kubenswrapper[4824]: I0224 00:09:43.940958 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="239fc97c-cb5a-4fa1-965e-7b64c90268ce" containerName="collect-profiles"
Feb 24 00:09:43 crc kubenswrapper[4824]: I0224 00:09:43.941101 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="393dd5ac-a813-412e-ac2d-1d654d3e5c64" containerName="pruner"
Feb 24 00:09:43 crc kubenswrapper[4824]: I0224 00:09:43.941122 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="239fc97c-cb5a-4fa1-965e-7b64c90268ce" containerName="collect-profiles"
Feb 24 00:09:43 crc kubenswrapper[4824]: I0224 00:09:43.941539 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-599d8ff48-qktrf"
Feb 24 00:09:43 crc kubenswrapper[4824]: I0224 00:09:43.950861 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 24 00:09:43 crc kubenswrapper[4824]: I0224 00:09:43.951339 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 24 00:09:43 crc kubenswrapper[4824]: I0224 00:09:43.959494 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 24 00:09:43 crc kubenswrapper[4824]: I0224 00:09:43.960783 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 24 00:09:43 crc kubenswrapper[4824]: I0224 00:09:43.960982 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 24 00:09:43 crc kubenswrapper[4824]: I0224 00:09:43.967637 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7968fcb589-gvmfw"]
Feb 24 00:09:43 crc kubenswrapper[4824]: I0224 00:09:43.967778 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 24 00:09:43 crc kubenswrapper[4824]: I0224 00:09:43.968089 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 24 00:09:43 crc kubenswrapper[4824]: I0224 00:09:43.968612 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7968fcb589-gvmfw"
Feb 24 00:09:43 crc kubenswrapper[4824]: I0224 00:09:43.975417 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 24 00:09:43 crc kubenswrapper[4824]: I0224 00:09:43.975652 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 24 00:09:43 crc kubenswrapper[4824]: I0224 00:09:43.975891 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 24 00:09:43 crc kubenswrapper[4824]: I0224 00:09:43.976042 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 24 00:09:43 crc kubenswrapper[4824]: I0224 00:09:43.976081 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 24 00:09:43 crc kubenswrapper[4824]: I0224 00:09:43.980093 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-599d8ff48-qktrf"]
Feb 24 00:09:43 crc kubenswrapper[4824]: I0224 00:09:43.986004 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9ca581b-1f92-4494-ab07-3c56396e862c-serving-cert\") pod \"controller-manager-599d8ff48-qktrf\" (UID: \"c9ca581b-1f92-4494-ab07-3c56396e862c\") " pod="openshift-controller-manager/controller-manager-599d8ff48-qktrf"
Feb 24 00:09:43 crc kubenswrapper[4824]: I0224 00:09:43.986059 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/921fd719-248a-40f2-901e-de82a8c6b9bc-serving-cert\") pod \"route-controller-manager-7968fcb589-gvmfw\" (UID: \"921fd719-248a-40f2-901e-de82a8c6b9bc\") " pod="openshift-route-controller-manager/route-controller-manager-7968fcb589-gvmfw"
Feb 24 00:09:43 crc kubenswrapper[4824]: I0224 00:09:43.986086 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c9ca581b-1f92-4494-ab07-3c56396e862c-proxy-ca-bundles\") pod \"controller-manager-599d8ff48-qktrf\" (UID: \"c9ca581b-1f92-4494-ab07-3c56396e862c\") " pod="openshift-controller-manager/controller-manager-599d8ff48-qktrf"
Feb 24 00:09:43 crc kubenswrapper[4824]: I0224 00:09:43.986110 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8k4h\" (UniqueName: \"kubernetes.io/projected/c9ca581b-1f92-4494-ab07-3c56396e862c-kube-api-access-s8k4h\") pod \"controller-manager-599d8ff48-qktrf\" (UID: \"c9ca581b-1f92-4494-ab07-3c56396e862c\") " pod="openshift-controller-manager/controller-manager-599d8ff48-qktrf"
Feb 24 00:09:43 crc kubenswrapper[4824]: I0224 00:09:43.986139 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/921fd719-248a-40f2-901e-de82a8c6b9bc-config\") pod \"route-controller-manager-7968fcb589-gvmfw\" (UID: \"921fd719-248a-40f2-901e-de82a8c6b9bc\") " pod="openshift-route-controller-manager/route-controller-manager-7968fcb589-gvmfw"
Feb 24 00:09:43 crc kubenswrapper[4824]: I0224 00:09:43.986178 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64rs6\" (UniqueName: \"kubernetes.io/projected/921fd719-248a-40f2-901e-de82a8c6b9bc-kube-api-access-64rs6\") pod \"route-controller-manager-7968fcb589-gvmfw\" (UID: \"921fd719-248a-40f2-901e-de82a8c6b9bc\") " pod="openshift-route-controller-manager/route-controller-manager-7968fcb589-gvmfw"
Feb 24 00:09:43 crc kubenswrapper[4824]: I0224 00:09:43.986208 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9ca581b-1f92-4494-ab07-3c56396e862c-config\") pod \"controller-manager-599d8ff48-qktrf\" (UID: \"c9ca581b-1f92-4494-ab07-3c56396e862c\") " pod="openshift-controller-manager/controller-manager-599d8ff48-qktrf"
Feb 24 00:09:43 crc kubenswrapper[4824]: I0224 00:09:43.986224 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c9ca581b-1f92-4494-ab07-3c56396e862c-client-ca\") pod \"controller-manager-599d8ff48-qktrf\" (UID: \"c9ca581b-1f92-4494-ab07-3c56396e862c\") " pod="openshift-controller-manager/controller-manager-599d8ff48-qktrf"
Feb 24 00:09:43 crc kubenswrapper[4824]: I0224 00:09:43.986244 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/921fd719-248a-40f2-901e-de82a8c6b9bc-client-ca\") pod \"route-controller-manager-7968fcb589-gvmfw\" (UID: \"921fd719-248a-40f2-901e-de82a8c6b9bc\") " pod="openshift-route-controller-manager/route-controller-manager-7968fcb589-gvmfw"
Feb 24 00:09:43 crc kubenswrapper[4824]: I0224 00:09:43.988109 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 24 00:09:43 crc kubenswrapper[4824]: I0224 00:09:43.998348 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7968fcb589-gvmfw"]
Feb 24 00:09:44 crc kubenswrapper[4824]: I0224 00:09:44.089758 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64rs6\" (UniqueName: \"kubernetes.io/projected/921fd719-248a-40f2-901e-de82a8c6b9bc-kube-api-access-64rs6\") pod \"route-controller-manager-7968fcb589-gvmfw\" (UID: \"921fd719-248a-40f2-901e-de82a8c6b9bc\") " pod="openshift-route-controller-manager/route-controller-manager-7968fcb589-gvmfw"
Feb 24 00:09:44 crc kubenswrapper[4824]: I0224 00:09:44.089827 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9ca581b-1f92-4494-ab07-3c56396e862c-config\") pod \"controller-manager-599d8ff48-qktrf\" (UID: \"c9ca581b-1f92-4494-ab07-3c56396e862c\") " pod="openshift-controller-manager/controller-manager-599d8ff48-qktrf"
Feb 24 00:09:44 crc kubenswrapper[4824]: I0224 00:09:44.089852 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c9ca581b-1f92-4494-ab07-3c56396e862c-client-ca\") pod \"controller-manager-599d8ff48-qktrf\" (UID: \"c9ca581b-1f92-4494-ab07-3c56396e862c\") " pod="openshift-controller-manager/controller-manager-599d8ff48-qktrf"
Feb 24 00:09:44 crc kubenswrapper[4824]: I0224 00:09:44.089878 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/921fd719-248a-40f2-901e-de82a8c6b9bc-client-ca\") pod \"route-controller-manager-7968fcb589-gvmfw\" (UID: \"921fd719-248a-40f2-901e-de82a8c6b9bc\") " pod="openshift-route-controller-manager/route-controller-manager-7968fcb589-gvmfw"
Feb 24 00:09:44 crc kubenswrapper[4824]: I0224 00:09:44.089942 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9ca581b-1f92-4494-ab07-3c56396e862c-serving-cert\") pod \"controller-manager-599d8ff48-qktrf\" (UID: \"c9ca581b-1f92-4494-ab07-3c56396e862c\") " pod="openshift-controller-manager/controller-manager-599d8ff48-qktrf"
Feb 24 00:09:44 crc kubenswrapper[4824]: I0224 00:09:44.089978 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/921fd719-248a-40f2-901e-de82a8c6b9bc-serving-cert\") pod \"route-controller-manager-7968fcb589-gvmfw\" (UID: \"921fd719-248a-40f2-901e-de82a8c6b9bc\") " pod="openshift-route-controller-manager/route-controller-manager-7968fcb589-gvmfw"
Feb 24 00:09:44 crc kubenswrapper[4824]: I0224 00:09:44.090003 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c9ca581b-1f92-4494-ab07-3c56396e862c-proxy-ca-bundles\") pod \"controller-manager-599d8ff48-qktrf\" (UID: \"c9ca581b-1f92-4494-ab07-3c56396e862c\") " pod="openshift-controller-manager/controller-manager-599d8ff48-qktrf"
Feb 24 00:09:44 crc kubenswrapper[4824]: I0224 00:09:44.090035 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8k4h\" (UniqueName: \"kubernetes.io/projected/c9ca581b-1f92-4494-ab07-3c56396e862c-kube-api-access-s8k4h\") pod \"controller-manager-599d8ff48-qktrf\" (UID: \"c9ca581b-1f92-4494-ab07-3c56396e862c\") " pod="openshift-controller-manager/controller-manager-599d8ff48-qktrf"
Feb 24 00:09:44 crc kubenswrapper[4824]: I0224 00:09:44.090070 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/921fd719-248a-40f2-901e-de82a8c6b9bc-config\") pod \"route-controller-manager-7968fcb589-gvmfw\" (UID: \"921fd719-248a-40f2-901e-de82a8c6b9bc\") " pod="openshift-route-controller-manager/route-controller-manager-7968fcb589-gvmfw"
Feb 24 00:09:44 crc kubenswrapper[4824]: I0224 00:09:44.091643 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/921fd719-248a-40f2-901e-de82a8c6b9bc-client-ca\") pod \"route-controller-manager-7968fcb589-gvmfw\" (UID: \"921fd719-248a-40f2-901e-de82a8c6b9bc\") " pod="openshift-route-controller-manager/route-controller-manager-7968fcb589-gvmfw"
Feb 24 00:09:44 crc kubenswrapper[4824]: I0224 00:09:44.091791 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/921fd719-248a-40f2-901e-de82a8c6b9bc-config\") pod \"route-controller-manager-7968fcb589-gvmfw\" (UID: \"921fd719-248a-40f2-901e-de82a8c6b9bc\") " pod="openshift-route-controller-manager/route-controller-manager-7968fcb589-gvmfw"
Feb 24 00:09:44 crc kubenswrapper[4824]: I0224 00:09:44.091872 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9ca581b-1f92-4494-ab07-3c56396e862c-config\") pod \"controller-manager-599d8ff48-qktrf\" (UID: \"c9ca581b-1f92-4494-ab07-3c56396e862c\") " pod="openshift-controller-manager/controller-manager-599d8ff48-qktrf"
Feb 24 00:09:44 crc kubenswrapper[4824]: I0224 00:09:44.092013 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c9ca581b-1f92-4494-ab07-3c56396e862c-client-ca\") pod \"controller-manager-599d8ff48-qktrf\" (UID: \"c9ca581b-1f92-4494-ab07-3c56396e862c\") " pod="openshift-controller-manager/controller-manager-599d8ff48-qktrf"
Feb 24 00:09:44 crc kubenswrapper[4824]: I0224 00:09:44.101020 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c9ca581b-1f92-4494-ab07-3c56396e862c-proxy-ca-bundles\") pod \"controller-manager-599d8ff48-qktrf\" (UID: \"c9ca581b-1f92-4494-ab07-3c56396e862c\") " pod="openshift-controller-manager/controller-manager-599d8ff48-qktrf"
Feb 24 00:09:44 crc kubenswrapper[4824]: I0224 00:09:44.111402 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64rs6\" (UniqueName: \"kubernetes.io/projected/921fd719-248a-40f2-901e-de82a8c6b9bc-kube-api-access-64rs6\") pod \"route-controller-manager-7968fcb589-gvmfw\" (UID: \"921fd719-248a-40f2-901e-de82a8c6b9bc\") " pod="openshift-route-controller-manager/route-controller-manager-7968fcb589-gvmfw"
Feb 24 00:09:44 crc kubenswrapper[4824]: I0224 00:09:44.120418 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/921fd719-248a-40f2-901e-de82a8c6b9bc-serving-cert\") pod \"route-controller-manager-7968fcb589-gvmfw\" (UID: \"921fd719-248a-40f2-901e-de82a8c6b9bc\") " pod="openshift-route-controller-manager/route-controller-manager-7968fcb589-gvmfw"
Feb 24 00:09:44 crc kubenswrapper[4824]: I0224 00:09:44.120434 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9ca581b-1f92-4494-ab07-3c56396e862c-serving-cert\") pod \"controller-manager-599d8ff48-qktrf\" (UID: \"c9ca581b-1f92-4494-ab07-3c56396e862c\") " pod="openshift-controller-manager/controller-manager-599d8ff48-qktrf"
Feb 24 00:09:44 crc kubenswrapper[4824]: I0224 00:09:44.143623 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8k4h\" (UniqueName: \"kubernetes.io/projected/c9ca581b-1f92-4494-ab07-3c56396e862c-kube-api-access-s8k4h\") pod \"controller-manager-599d8ff48-qktrf\" (UID: \"c9ca581b-1f92-4494-ab07-3c56396e862c\") " pod="openshift-controller-manager/controller-manager-599d8ff48-qktrf"
Feb 24 00:09:44 crc kubenswrapper[4824]: I0224 00:09:44.272983 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-599d8ff48-qktrf"
Feb 24 00:09:44 crc kubenswrapper[4824]: I0224 00:09:44.296998 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7968fcb589-gvmfw"
Feb 24 00:09:44 crc kubenswrapper[4824]: I0224 00:09:44.612853 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a648113f-3e46-4170-ba30-7155fefbb413-metrics-certs\") pod \"network-metrics-daemon-98z42\" (UID: \"a648113f-3e46-4170-ba30-7155fefbb413\") " pod="openshift-multus/network-metrics-daemon-98z42"
Feb 24 00:09:44 crc kubenswrapper[4824]: I0224 00:09:44.616209 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Feb 24 00:09:44 crc kubenswrapper[4824]: I0224 00:09:44.630729 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a648113f-3e46-4170-ba30-7155fefbb413-metrics-certs\") pod \"network-metrics-daemon-98z42\" (UID: \"a648113f-3e46-4170-ba30-7155fefbb413\") " pod="openshift-multus/network-metrics-daemon-98z42"
Feb 24 00:09:44 crc kubenswrapper[4824]: W0224 00:09:44.741367 4824 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9ca581b_1f92_4494_ab07_3c56396e862c.slice/crio-3348d9f9c7ede94db1fa63c997fae65b1355a60f270f2e59b5e41ad0ae9f9767 WatchSource:0}: Error finding container 3348d9f9c7ede94db1fa63c997fae65b1355a60f270f2e59b5e41ad0ae9f9767: Status 404 returned error can't find the container with id 3348d9f9c7ede94db1fa63c997fae65b1355a60f270f2e59b5e41ad0ae9f9767
Feb 24 00:09:44 crc kubenswrapper[4824]: I0224 00:09:44.774021 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-599d8ff48-qktrf"]
Feb 24 00:09:44 crc kubenswrapper[4824]: I0224 00:09:44.838937 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Feb 24 00:09:44 crc kubenswrapper[4824]: I0224 00:09:44.847011 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-98z42"
Feb 24 00:09:44 crc kubenswrapper[4824]: I0224 00:09:44.901860 4824 patch_prober.go:28] interesting pod/router-default-5444994796-fp4wq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 24 00:09:44 crc kubenswrapper[4824]: [-]has-synced failed: reason withheld
Feb 24 00:09:44 crc kubenswrapper[4824]: [+]process-running ok
Feb 24 00:09:44 crc kubenswrapper[4824]: healthz check failed
Feb 24 00:09:44 crc kubenswrapper[4824]: I0224 00:09:44.901953 4824 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fp4wq" podUID="5b0ff99f-1e04-4e23-895a-a02a303c8daa" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 24 00:09:44 crc kubenswrapper[4824]: I0224 00:09:44.990717 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7968fcb589-gvmfw"]
Feb 24 00:09:45 crc kubenswrapper[4824]: I0224 00:09:45.326913 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-98z42"]
Feb 24 00:09:45 crc kubenswrapper[4824]: W0224 00:09:45.413916 4824 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda648113f_3e46_4170_ba30_7155fefbb413.slice/crio-d290ba414cf75b41454b8c91c765b6bdfd8b8c8bbfe011e5cff0544b7a44506e WatchSource:0}: Error finding container d290ba414cf75b41454b8c91c765b6bdfd8b8c8bbfe011e5cff0544b7a44506e: Status 404 returned error can't find the container with id d290ba414cf75b41454b8c91c765b6bdfd8b8c8bbfe011e5cff0544b7a44506e
Feb 24 00:09:45 crc kubenswrapper[4824]: I0224 00:09:45.442075 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-98z42" event={"ID":"a648113f-3e46-4170-ba30-7155fefbb413","Type":"ContainerStarted","Data":"d290ba414cf75b41454b8c91c765b6bdfd8b8c8bbfe011e5cff0544b7a44506e"}
Feb 24 00:09:45 crc kubenswrapper[4824]: I0224 00:09:45.457509 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-599d8ff48-qktrf" event={"ID":"c9ca581b-1f92-4494-ab07-3c56396e862c","Type":"ContainerStarted","Data":"61152d725e3742568d9637a36b75cafdf6279903aa2b72696fb965f86fc0262a"}
Feb 24 00:09:45 crc kubenswrapper[4824]: I0224 00:09:45.457643 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-599d8ff48-qktrf" event={"ID":"c9ca581b-1f92-4494-ab07-3c56396e862c","Type":"ContainerStarted","Data":"3348d9f9c7ede94db1fa63c997fae65b1355a60f270f2e59b5e41ad0ae9f9767"}
Feb 24 00:09:45 crc kubenswrapper[4824]: I0224 00:09:45.459029 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-599d8ff48-qktrf"
Feb 24 00:09:45 crc kubenswrapper[4824]: I0224 00:09:45.473733 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7968fcb589-gvmfw" event={"ID":"921fd719-248a-40f2-901e-de82a8c6b9bc","Type":"ContainerStarted","Data":"ebe73061062b5fb6815834caff328cf47e5bdaff5958c1bf3a4054a250e190c9"}
Feb 24 00:09:45 crc kubenswrapper[4824]: I0224 00:09:45.475638 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-599d8ff48-qktrf"
Feb 24 00:09:45 crc kubenswrapper[4824]: I0224 00:09:45.559926 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-599d8ff48-qktrf" podStartSLOduration=5.559898744 podStartE2EDuration="5.559898744s" podCreationTimestamp="2026-02-24 00:09:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:09:45.516332323 +0000 UTC m=+249.505956792" watchObservedRunningTime="2026-02-24 00:09:45.559898744 +0000 UTC m=+249.549523223"
Feb 24 00:09:45 crc kubenswrapper[4824]: I0224 00:09:45.906174 4824 patch_prober.go:28] interesting pod/router-default-5444994796-fp4wq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 24 00:09:45 crc kubenswrapper[4824]: [-]has-synced failed: reason withheld
Feb 24 00:09:45 crc kubenswrapper[4824]: [+]process-running ok
Feb 24 00:09:45 crc kubenswrapper[4824]: healthz check failed
Feb 24 00:09:45 crc kubenswrapper[4824]: I0224 00:09:45.906613 4824 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fp4wq" podUID="5b0ff99f-1e04-4e23-895a-a02a303c8daa" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 24 00:09:46 crc kubenswrapper[4824]: I0224 00:09:46.383188 4824 patch_prober.go:28] interesting pod/downloads-7954f5f757-r4c4b container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body=
Feb 24 00:09:46 crc kubenswrapper[4824]: I0224 00:09:46.383236 4824 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-r4c4b" podUID="581e69ae-c21a-4a9e-b1ea-9c38256d7b30" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused"
Feb 24 00:09:46 crc kubenswrapper[4824]: I0224 00:09:46.383300 4824 patch_prober.go:28] interesting pod/downloads-7954f5f757-r4c4b container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body=
Feb 24 00:09:46 crc kubenswrapper[4824]: I0224 00:09:46.383379 4824 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-r4c4b" podUID="581e69ae-c21a-4a9e-b1ea-9c38256d7b30" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused"
Feb 24 00:09:46 crc kubenswrapper[4824]: I0224 00:09:46.531459 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7968fcb589-gvmfw" event={"ID":"921fd719-248a-40f2-901e-de82a8c6b9bc","Type":"ContainerStarted","Data":"d35da3b92a34ebadb664ea295a072909420d8af61a251763865432159776fa7d"}
Feb 24 00:09:46 crc kubenswrapper[4824]: I0224 00:09:46.533423 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7968fcb589-gvmfw"
Feb 24 00:09:46 crc kubenswrapper[4824]: I0224 00:09:46.568253 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-98z42" event={"ID":"a648113f-3e46-4170-ba30-7155fefbb413","Type":"ContainerStarted","Data":"f92de97b983ee28ba4c6e2b7ff56d526cf8e45c43a74482e1c082c075f6792a2"}
Feb 24 00:09:46 crc kubenswrapper[4824]: I0224 00:09:46.582490 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7968fcb589-gvmfw" podStartSLOduration=6.582468787 podStartE2EDuration="6.582468787s" podCreationTimestamp="2026-02-24 00:09:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:09:46.582234701 +0000 UTC m=+250.571859180" watchObservedRunningTime="2026-02-24 00:09:46.582468787 +0000 UTC m=+250.572093256"
Feb 24 00:09:46 crc kubenswrapper[4824]: I0224 00:09:46.660621 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7968fcb589-gvmfw" Feb 24 00:09:46 crc kubenswrapper[4824]: I0224 00:09:46.908875 4824 patch_prober.go:28] interesting pod/router-default-5444994796-fp4wq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 24 00:09:46 crc kubenswrapper[4824]: [-]has-synced failed: reason withheld Feb 24 00:09:46 crc kubenswrapper[4824]: [+]process-running ok Feb 24 00:09:46 crc kubenswrapper[4824]: healthz check failed Feb 24 00:09:46 crc kubenswrapper[4824]: I0224 00:09:46.909270 4824 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fp4wq" podUID="5b0ff99f-1e04-4e23-895a-a02a303c8daa" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 24 00:09:47 crc kubenswrapper[4824]: I0224 00:09:47.367462 4824 patch_prober.go:28] interesting pod/console-f9d7485db-zlnwh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.35:8443/health\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body= Feb 24 00:09:47 crc kubenswrapper[4824]: I0224 00:09:47.367547 4824 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-zlnwh" podUID="5f4f79cd-ada9-4ec7-b779-94d97bdadc97" containerName="console" probeResult="failure" output="Get \"https://10.217.0.35:8443/health\": dial tcp 10.217.0.35:8443: connect: connection refused" Feb 24 00:09:47 crc kubenswrapper[4824]: I0224 00:09:47.634460 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-98z42" 
event={"ID":"a648113f-3e46-4170-ba30-7155fefbb413","Type":"ContainerStarted","Data":"e8529f608c6974ce2db44f586758015db9551abecd7edd722b8e5fa02da03cae"} Feb 24 00:09:47 crc kubenswrapper[4824]: I0224 00:09:47.900854 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-fp4wq" Feb 24 00:09:47 crc kubenswrapper[4824]: I0224 00:09:47.907640 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-fp4wq" Feb 24 00:09:53 crc kubenswrapper[4824]: I0224 00:09:53.276240 4824 patch_prober.go:28] interesting pod/machine-config-daemon-vcbgn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 00:09:53 crc kubenswrapper[4824]: I0224 00:09:53.277026 4824 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 00:09:56 crc kubenswrapper[4824]: I0224 00:09:56.383068 4824 patch_prober.go:28] interesting pod/downloads-7954f5f757-r4c4b container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Feb 24 00:09:56 crc kubenswrapper[4824]: I0224 00:09:56.383579 4824 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-r4c4b" podUID="581e69ae-c21a-4a9e-b1ea-9c38256d7b30" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Feb 24 00:09:56 
crc kubenswrapper[4824]: I0224 00:09:56.383634 4824 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-r4c4b" Feb 24 00:09:56 crc kubenswrapper[4824]: I0224 00:09:56.383248 4824 patch_prober.go:28] interesting pod/downloads-7954f5f757-r4c4b container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Feb 24 00:09:56 crc kubenswrapper[4824]: I0224 00:09:56.384005 4824 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-r4c4b" podUID="581e69ae-c21a-4a9e-b1ea-9c38256d7b30" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Feb 24 00:09:56 crc kubenswrapper[4824]: I0224 00:09:56.384330 4824 patch_prober.go:28] interesting pod/downloads-7954f5f757-r4c4b container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Feb 24 00:09:56 crc kubenswrapper[4824]: I0224 00:09:56.384377 4824 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"deb3616bdfcc08678302c0e0617b53f7bdd5f57fee7e5facc2929f3b91c7322b"} pod="openshift-console/downloads-7954f5f757-r4c4b" containerMessage="Container download-server failed liveness probe, will be restarted" Feb 24 00:09:56 crc kubenswrapper[4824]: I0224 00:09:56.384422 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-r4c4b" podUID="581e69ae-c21a-4a9e-b1ea-9c38256d7b30" containerName="download-server" containerID="cri-o://deb3616bdfcc08678302c0e0617b53f7bdd5f57fee7e5facc2929f3b91c7322b" gracePeriod=2 Feb 24 00:09:56 crc 
kubenswrapper[4824]: I0224 00:09:56.384405 4824 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-r4c4b" podUID="581e69ae-c21a-4a9e-b1ea-9c38256d7b30" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Feb 24 00:09:57 crc kubenswrapper[4824]: I0224 00:09:57.370038 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-zlnwh" Feb 24 00:09:57 crc kubenswrapper[4824]: I0224 00:09:57.374559 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-zlnwh" Feb 24 00:09:57 crc kubenswrapper[4824]: I0224 00:09:57.391546 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-98z42" podStartSLOduration=187.391507676 podStartE2EDuration="3m7.391507676s" podCreationTimestamp="2026-02-24 00:06:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:09:48.679889954 +0000 UTC m=+252.669514443" watchObservedRunningTime="2026-02-24 00:09:57.391507676 +0000 UTC m=+261.381132145" Feb 24 00:09:59 crc kubenswrapper[4824]: I0224 00:09:59.552867 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" Feb 24 00:09:59 crc kubenswrapper[4824]: I0224 00:09:59.848694 4824 generic.go:334] "Generic (PLEG): container finished" podID="581e69ae-c21a-4a9e-b1ea-9c38256d7b30" containerID="deb3616bdfcc08678302c0e0617b53f7bdd5f57fee7e5facc2929f3b91c7322b" exitCode=0 Feb 24 00:09:59 crc kubenswrapper[4824]: I0224 00:09:59.848771 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-r4c4b" 
event={"ID":"581e69ae-c21a-4a9e-b1ea-9c38256d7b30","Type":"ContainerDied","Data":"deb3616bdfcc08678302c0e0617b53f7bdd5f57fee7e5facc2929f3b91c7322b"} Feb 24 00:10:06 crc kubenswrapper[4824]: I0224 00:10:06.384120 4824 patch_prober.go:28] interesting pod/downloads-7954f5f757-r4c4b container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Feb 24 00:10:06 crc kubenswrapper[4824]: I0224 00:10:06.384535 4824 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-r4c4b" podUID="581e69ae-c21a-4a9e-b1ea-9c38256d7b30" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Feb 24 00:10:07 crc kubenswrapper[4824]: I0224 00:10:07.352014 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2jqvq" Feb 24 00:10:13 crc kubenswrapper[4824]: I0224 00:10:13.929246 4824 generic.go:334] "Generic (PLEG): container finished" podID="f09bc4be-bc94-4c63-93ec-4bc2fef07d1b" containerID="6b2ac39d85326d80c4e57096bd6873f9064eac38a27f5eddf04bd260901e4edf" exitCode=0 Feb 24 00:10:13 crc kubenswrapper[4824]: I0224 00:10:13.929329 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29531520-969xh" event={"ID":"f09bc4be-bc94-4c63-93ec-4bc2fef07d1b","Type":"ContainerDied","Data":"6b2ac39d85326d80c4e57096bd6873f9064eac38a27f5eddf04bd260901e4edf"} Feb 24 00:10:15 crc kubenswrapper[4824]: I0224 00:10:15.755124 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 24 00:10:15 crc kubenswrapper[4824]: I0224 00:10:15.756544 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 24 00:10:15 crc kubenswrapper[4824]: I0224 00:10:15.759704 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 24 00:10:15 crc kubenswrapper[4824]: I0224 00:10:15.760078 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 24 00:10:15 crc kubenswrapper[4824]: I0224 00:10:15.771364 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 24 00:10:15 crc kubenswrapper[4824]: I0224 00:10:15.904743 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a3174486-c5bc-4ef6-925d-70554d62d1f9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a3174486-c5bc-4ef6-925d-70554d62d1f9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 24 00:10:15 crc kubenswrapper[4824]: I0224 00:10:15.904811 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a3174486-c5bc-4ef6-925d-70554d62d1f9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a3174486-c5bc-4ef6-925d-70554d62d1f9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 24 00:10:16 crc kubenswrapper[4824]: I0224 00:10:16.007208 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a3174486-c5bc-4ef6-925d-70554d62d1f9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a3174486-c5bc-4ef6-925d-70554d62d1f9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 24 00:10:16 crc kubenswrapper[4824]: I0224 00:10:16.007320 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/a3174486-c5bc-4ef6-925d-70554d62d1f9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a3174486-c5bc-4ef6-925d-70554d62d1f9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 24 00:10:16 crc kubenswrapper[4824]: I0224 00:10:16.007381 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a3174486-c5bc-4ef6-925d-70554d62d1f9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a3174486-c5bc-4ef6-925d-70554d62d1f9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 24 00:10:16 crc kubenswrapper[4824]: I0224 00:10:16.026068 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a3174486-c5bc-4ef6-925d-70554d62d1f9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a3174486-c5bc-4ef6-925d-70554d62d1f9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 24 00:10:16 crc kubenswrapper[4824]: I0224 00:10:16.124109 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 24 00:10:16 crc kubenswrapper[4824]: I0224 00:10:16.385236 4824 patch_prober.go:28] interesting pod/downloads-7954f5f757-r4c4b container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Feb 24 00:10:16 crc kubenswrapper[4824]: I0224 00:10:16.385309 4824 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-r4c4b" podUID="581e69ae-c21a-4a9e-b1ea-9c38256d7b30" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Feb 24 00:10:16 crc kubenswrapper[4824]: I0224 00:10:16.638619 4824 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29531520-969xh" Feb 24 00:10:16 crc kubenswrapper[4824]: I0224 00:10:16.716421 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qc987\" (UniqueName: \"kubernetes.io/projected/f09bc4be-bc94-4c63-93ec-4bc2fef07d1b-kube-api-access-qc987\") pod \"f09bc4be-bc94-4c63-93ec-4bc2fef07d1b\" (UID: \"f09bc4be-bc94-4c63-93ec-4bc2fef07d1b\") " Feb 24 00:10:16 crc kubenswrapper[4824]: I0224 00:10:16.716566 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f09bc4be-bc94-4c63-93ec-4bc2fef07d1b-serviceca\") pod \"f09bc4be-bc94-4c63-93ec-4bc2fef07d1b\" (UID: \"f09bc4be-bc94-4c63-93ec-4bc2fef07d1b\") " Feb 24 00:10:16 crc kubenswrapper[4824]: I0224 00:10:16.717387 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f09bc4be-bc94-4c63-93ec-4bc2fef07d1b-serviceca" (OuterVolumeSpecName: "serviceca") pod "f09bc4be-bc94-4c63-93ec-4bc2fef07d1b" (UID: "f09bc4be-bc94-4c63-93ec-4bc2fef07d1b"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:10:16 crc kubenswrapper[4824]: I0224 00:10:16.720117 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f09bc4be-bc94-4c63-93ec-4bc2fef07d1b-kube-api-access-qc987" (OuterVolumeSpecName: "kube-api-access-qc987") pod "f09bc4be-bc94-4c63-93ec-4bc2fef07d1b" (UID: "f09bc4be-bc94-4c63-93ec-4bc2fef07d1b"). InnerVolumeSpecName "kube-api-access-qc987". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:10:16 crc kubenswrapper[4824]: I0224 00:10:16.818482 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qc987\" (UniqueName: \"kubernetes.io/projected/f09bc4be-bc94-4c63-93ec-4bc2fef07d1b-kube-api-access-qc987\") on node \"crc\" DevicePath \"\"" Feb 24 00:10:16 crc kubenswrapper[4824]: I0224 00:10:16.818561 4824 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f09bc4be-bc94-4c63-93ec-4bc2fef07d1b-serviceca\") on node \"crc\" DevicePath \"\"" Feb 24 00:10:16 crc kubenswrapper[4824]: I0224 00:10:16.950793 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29531520-969xh" event={"ID":"f09bc4be-bc94-4c63-93ec-4bc2fef07d1b","Type":"ContainerDied","Data":"4fbc9ac5dc3c69b5711041434cb98b9bdb56115d74b5092502a2732ff4babe43"} Feb 24 00:10:16 crc kubenswrapper[4824]: I0224 00:10:16.951234 4824 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4fbc9ac5dc3c69b5711041434cb98b9bdb56115d74b5092502a2732ff4babe43" Feb 24 00:10:16 crc kubenswrapper[4824]: I0224 00:10:16.950847 4824 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29531520-969xh" Feb 24 00:10:20 crc kubenswrapper[4824]: I0224 00:10:20.552567 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 24 00:10:20 crc kubenswrapper[4824]: E0224 00:10:20.553750 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f09bc4be-bc94-4c63-93ec-4bc2fef07d1b" containerName="image-pruner" Feb 24 00:10:20 crc kubenswrapper[4824]: I0224 00:10:20.553768 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="f09bc4be-bc94-4c63-93ec-4bc2fef07d1b" containerName="image-pruner" Feb 24 00:10:20 crc kubenswrapper[4824]: I0224 00:10:20.553928 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="f09bc4be-bc94-4c63-93ec-4bc2fef07d1b" containerName="image-pruner" Feb 24 00:10:20 crc kubenswrapper[4824]: I0224 00:10:20.554530 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 24 00:10:20 crc kubenswrapper[4824]: I0224 00:10:20.566107 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 24 00:10:20 crc kubenswrapper[4824]: I0224 00:10:20.680415 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/466928f3-88e1-4111-8358-13db2bd5ba58-kube-api-access\") pod \"installer-9-crc\" (UID: \"466928f3-88e1-4111-8358-13db2bd5ba58\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 24 00:10:20 crc kubenswrapper[4824]: I0224 00:10:20.680539 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/466928f3-88e1-4111-8358-13db2bd5ba58-kubelet-dir\") pod \"installer-9-crc\" (UID: \"466928f3-88e1-4111-8358-13db2bd5ba58\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 24 00:10:20 crc 
kubenswrapper[4824]: I0224 00:10:20.680579 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/466928f3-88e1-4111-8358-13db2bd5ba58-var-lock\") pod \"installer-9-crc\" (UID: \"466928f3-88e1-4111-8358-13db2bd5ba58\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 24 00:10:20 crc kubenswrapper[4824]: I0224 00:10:20.781655 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/466928f3-88e1-4111-8358-13db2bd5ba58-kubelet-dir\") pod \"installer-9-crc\" (UID: \"466928f3-88e1-4111-8358-13db2bd5ba58\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 24 00:10:20 crc kubenswrapper[4824]: I0224 00:10:20.781724 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/466928f3-88e1-4111-8358-13db2bd5ba58-var-lock\") pod \"installer-9-crc\" (UID: \"466928f3-88e1-4111-8358-13db2bd5ba58\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 24 00:10:20 crc kubenswrapper[4824]: I0224 00:10:20.781819 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/466928f3-88e1-4111-8358-13db2bd5ba58-kube-api-access\") pod \"installer-9-crc\" (UID: \"466928f3-88e1-4111-8358-13db2bd5ba58\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 24 00:10:20 crc kubenswrapper[4824]: I0224 00:10:20.781877 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/466928f3-88e1-4111-8358-13db2bd5ba58-kubelet-dir\") pod \"installer-9-crc\" (UID: \"466928f3-88e1-4111-8358-13db2bd5ba58\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 24 00:10:20 crc kubenswrapper[4824]: I0224 00:10:20.781994 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" 
(UniqueName: \"kubernetes.io/host-path/466928f3-88e1-4111-8358-13db2bd5ba58-var-lock\") pod \"installer-9-crc\" (UID: \"466928f3-88e1-4111-8358-13db2bd5ba58\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 24 00:10:20 crc kubenswrapper[4824]: I0224 00:10:20.820666 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/466928f3-88e1-4111-8358-13db2bd5ba58-kube-api-access\") pod \"installer-9-crc\" (UID: \"466928f3-88e1-4111-8358-13db2bd5ba58\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 24 00:10:20 crc kubenswrapper[4824]: I0224 00:10:20.885892 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 24 00:10:21 crc kubenswrapper[4824]: E0224 00:10:21.387123 4824 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 24 00:10:21 crc kubenswrapper[4824]: E0224 00:10:21.387346 4824 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7fhvh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-gl27t_openshift-marketplace(2da73289-3f96-4828-a106-46c3b0469e7d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 24 00:10:21 crc kubenswrapper[4824]: E0224 00:10:21.388496 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-gl27t" podUID="2da73289-3f96-4828-a106-46c3b0469e7d" Feb 24 00:10:22 crc 
kubenswrapper[4824]: E0224 00:10:22.909597 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-gl27t" podUID="2da73289-3f96-4828-a106-46c3b0469e7d" Feb 24 00:10:23 crc kubenswrapper[4824]: E0224 00:10:23.006227 4824 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 24 00:10:23 crc kubenswrapper[4824]: E0224 00:10:23.006450 4824 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c94kl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-6kgrd_openshift-marketplace(b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 24 00:10:23 crc kubenswrapper[4824]: E0224 00:10:23.008081 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-6kgrd" podUID="b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4" Feb 24 00:10:23 crc 
kubenswrapper[4824]: I0224 00:10:23.276782 4824 patch_prober.go:28] interesting pod/machine-config-daemon-vcbgn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 00:10:23 crc kubenswrapper[4824]: I0224 00:10:23.276906 4824 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 00:10:23 crc kubenswrapper[4824]: I0224 00:10:23.277003 4824 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" Feb 24 00:10:23 crc kubenswrapper[4824]: I0224 00:10:23.278185 4824 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"13c56d6a66d6912c9ae019eb515bfeb043d9ce26a200da5330581d07cd849ba3"} pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 24 00:10:23 crc kubenswrapper[4824]: I0224 00:10:23.278357 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" containerName="machine-config-daemon" containerID="cri-o://13c56d6a66d6912c9ae019eb515bfeb043d9ce26a200da5330581d07cd849ba3" gracePeriod=600 Feb 24 00:10:23 crc kubenswrapper[4824]: I0224 00:10:23.993707 4824 generic.go:334] "Generic (PLEG): container finished" podID="939ca085-9383-42e6-b7d6-37f101137273" 
containerID="13c56d6a66d6912c9ae019eb515bfeb043d9ce26a200da5330581d07cd849ba3" exitCode=0 Feb 24 00:10:23 crc kubenswrapper[4824]: I0224 00:10:23.993794 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" event={"ID":"939ca085-9383-42e6-b7d6-37f101137273","Type":"ContainerDied","Data":"13c56d6a66d6912c9ae019eb515bfeb043d9ce26a200da5330581d07cd849ba3"} Feb 24 00:10:24 crc kubenswrapper[4824]: E0224 00:10:24.155199 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-6kgrd" podUID="b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4" Feb 24 00:10:24 crc kubenswrapper[4824]: E0224 00:10:24.221452 4824 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 24 00:10:24 crc kubenswrapper[4824]: E0224 00:10:24.221668 4824 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hggqz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-nzqwf_openshift-marketplace(b142d96b-87c3-444b-b135-fdddaa658234): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 24 00:10:24 crc kubenswrapper[4824]: E0224 00:10:24.224288 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-nzqwf" podUID="b142d96b-87c3-444b-b135-fdddaa658234" Feb 24 00:10:24 crc 
kubenswrapper[4824]: E0224 00:10:24.247208 4824 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 24 00:10:24 crc kubenswrapper[4824]: E0224 00:10:24.247400 4824 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zc6ds,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-zxplg_openshift-marketplace(7a78c7d6-6ec6-4857-af87-25c5c8cf961d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 24 00:10:24 crc kubenswrapper[4824]: E0224 00:10:24.249636 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-zxplg" podUID="7a78c7d6-6ec6-4857-af87-25c5c8cf961d" Feb 24 00:10:25 crc kubenswrapper[4824]: E0224 00:10:25.556697 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-nzqwf" podUID="b142d96b-87c3-444b-b135-fdddaa658234" Feb 24 00:10:25 crc kubenswrapper[4824]: E0224 00:10:25.556738 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zxplg" podUID="7a78c7d6-6ec6-4857-af87-25c5c8cf961d" Feb 24 00:10:25 crc kubenswrapper[4824]: E0224 00:10:25.657613 4824 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 24 00:10:25 crc kubenswrapper[4824]: E0224 00:10:25.657815 4824 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t895b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-bfhcg_openshift-marketplace(b00860ed-9085-40bb-9041-16eac6d88fb1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 24 00:10:25 crc kubenswrapper[4824]: E0224 00:10:25.659382 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: 
code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-bfhcg" podUID="b00860ed-9085-40bb-9041-16eac6d88fb1" Feb 24 00:10:25 crc kubenswrapper[4824]: E0224 00:10:25.696897 4824 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 24 00:10:25 crc kubenswrapper[4824]: E0224 00:10:25.697297 4824 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ndbcf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFro
mSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-hhftg_openshift-marketplace(3e306ddf-071d-47f2-b9b1-bf772963438e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 24 00:10:25 crc kubenswrapper[4824]: E0224 00:10:25.698508 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-hhftg" podUID="3e306ddf-071d-47f2-b9b1-bf772963438e" Feb 24 00:10:25 crc kubenswrapper[4824]: E0224 00:10:25.709881 4824 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 24 00:10:25 crc kubenswrapper[4824]: E0224 00:10:25.710069 4824 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8k4qw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-mfzkw_openshift-marketplace(08de7fe0-2d54-408b-8e09-3e1b9bcf931a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 24 00:10:25 crc kubenswrapper[4824]: E0224 00:10:25.715052 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-mfzkw" podUID="08de7fe0-2d54-408b-8e09-3e1b9bcf931a" Feb 24 00:10:25 crc 
kubenswrapper[4824]: E0224 00:10:25.724244 4824 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 24 00:10:25 crc kubenswrapper[4824]: E0224 00:10:25.727429 4824 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bxnlm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-dmjz7_openshift-marketplace(cc119514-5c95-4925-8a1a-3e6844a34e1e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 24 00:10:25 crc kubenswrapper[4824]: E0224 00:10:25.730909 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-dmjz7" podUID="cc119514-5c95-4925-8a1a-3e6844a34e1e" Feb 24 00:10:25 crc kubenswrapper[4824]: I0224 00:10:25.879592 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 24 00:10:25 crc kubenswrapper[4824]: I0224 00:10:25.937652 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 24 00:10:25 crc kubenswrapper[4824]: I0224 00:10:25.986719 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jf5jw"] Feb 24 00:10:26 crc kubenswrapper[4824]: I0224 00:10:26.007791 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"a3174486-c5bc-4ef6-925d-70554d62d1f9","Type":"ContainerStarted","Data":"1c14510da4c8e4a0e7f0e53a8715b277a09e7b810b82c332501cf22906c6321c"} Feb 24 00:10:26 crc kubenswrapper[4824]: I0224 00:10:26.010125 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" event={"ID":"939ca085-9383-42e6-b7d6-37f101137273","Type":"ContainerStarted","Data":"ec5f29f7aaf13391c2278f1eb972e5c2f9ed40d998b7f6d08d6d97e54173df94"} Feb 24 00:10:26 crc kubenswrapper[4824]: I0224 00:10:26.017928 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" 
event={"ID":"466928f3-88e1-4111-8358-13db2bd5ba58","Type":"ContainerStarted","Data":"3e14bb11973a03d1681cf1c9d6b14d165f03b40a1a2b66d541ff08ad0753d14f"} Feb 24 00:10:26 crc kubenswrapper[4824]: I0224 00:10:26.021258 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-r4c4b" event={"ID":"581e69ae-c21a-4a9e-b1ea-9c38256d7b30","Type":"ContainerStarted","Data":"5380aecc20f70dc2f48069e8ddfce36ada45db6c57f2622f028b0cec6c77999b"} Feb 24 00:10:26 crc kubenswrapper[4824]: I0224 00:10:26.022128 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-r4c4b" Feb 24 00:10:26 crc kubenswrapper[4824]: I0224 00:10:26.024713 4824 patch_prober.go:28] interesting pod/downloads-7954f5f757-r4c4b container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Feb 24 00:10:26 crc kubenswrapper[4824]: I0224 00:10:26.024785 4824 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-r4c4b" podUID="581e69ae-c21a-4a9e-b1ea-9c38256d7b30" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Feb 24 00:10:26 crc kubenswrapper[4824]: E0224 00:10:26.036931 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-hhftg" podUID="3e306ddf-071d-47f2-b9b1-bf772963438e" Feb 24 00:10:26 crc kubenswrapper[4824]: E0224 00:10:26.037309 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-bfhcg" podUID="b00860ed-9085-40bb-9041-16eac6d88fb1" Feb 24 00:10:26 crc kubenswrapper[4824]: E0224 00:10:26.037378 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-mfzkw" podUID="08de7fe0-2d54-408b-8e09-3e1b9bcf931a" Feb 24 00:10:26 crc kubenswrapper[4824]: E0224 00:10:26.037446 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-dmjz7" podUID="cc119514-5c95-4925-8a1a-3e6844a34e1e" Feb 24 00:10:26 crc kubenswrapper[4824]: I0224 00:10:26.383319 4824 patch_prober.go:28] interesting pod/downloads-7954f5f757-r4c4b container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Feb 24 00:10:26 crc kubenswrapper[4824]: I0224 00:10:26.383346 4824 patch_prober.go:28] interesting pod/downloads-7954f5f757-r4c4b container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Feb 24 00:10:26 crc kubenswrapper[4824]: I0224 00:10:26.383849 4824 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-r4c4b" podUID="581e69ae-c21a-4a9e-b1ea-9c38256d7b30" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Feb 24 00:10:26 crc 
kubenswrapper[4824]: I0224 00:10:26.383911 4824 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-r4c4b" podUID="581e69ae-c21a-4a9e-b1ea-9c38256d7b30" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Feb 24 00:10:27 crc kubenswrapper[4824]: I0224 00:10:27.048321 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"466928f3-88e1-4111-8358-13db2bd5ba58","Type":"ContainerStarted","Data":"713672e1f8b7451df74461c3a60e57bbab1ffd950fc9f64d1a805ac3787f3127"} Feb 24 00:10:27 crc kubenswrapper[4824]: I0224 00:10:27.054108 4824 generic.go:334] "Generic (PLEG): container finished" podID="a3174486-c5bc-4ef6-925d-70554d62d1f9" containerID="33248aff8f19a69ca84b1e0b96ee793ea313f709995ac8ddf21c077968256a9e" exitCode=0 Feb 24 00:10:27 crc kubenswrapper[4824]: I0224 00:10:27.054405 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"a3174486-c5bc-4ef6-925d-70554d62d1f9","Type":"ContainerDied","Data":"33248aff8f19a69ca84b1e0b96ee793ea313f709995ac8ddf21c077968256a9e"} Feb 24 00:10:27 crc kubenswrapper[4824]: I0224 00:10:27.055855 4824 patch_prober.go:28] interesting pod/downloads-7954f5f757-r4c4b container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Feb 24 00:10:27 crc kubenswrapper[4824]: I0224 00:10:27.055910 4824 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-r4c4b" podUID="581e69ae-c21a-4a9e-b1ea-9c38256d7b30" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Feb 24 00:10:27 crc kubenswrapper[4824]: I0224 
00:10:27.072860 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=7.07283417 podStartE2EDuration="7.07283417s" podCreationTimestamp="2026-02-24 00:10:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:10:27.071118153 +0000 UTC m=+291.060742622" watchObservedRunningTime="2026-02-24 00:10:27.07283417 +0000 UTC m=+291.062458629" Feb 24 00:10:28 crc kubenswrapper[4824]: I0224 00:10:28.061371 4824 patch_prober.go:28] interesting pod/downloads-7954f5f757-r4c4b container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Feb 24 00:10:28 crc kubenswrapper[4824]: I0224 00:10:28.061937 4824 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-r4c4b" podUID="581e69ae-c21a-4a9e-b1ea-9c38256d7b30" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Feb 24 00:10:28 crc kubenswrapper[4824]: I0224 00:10:28.313365 4824 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 24 00:10:28 crc kubenswrapper[4824]: I0224 00:10:28.400349 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a3174486-c5bc-4ef6-925d-70554d62d1f9-kubelet-dir\") pod \"a3174486-c5bc-4ef6-925d-70554d62d1f9\" (UID: \"a3174486-c5bc-4ef6-925d-70554d62d1f9\") " Feb 24 00:10:28 crc kubenswrapper[4824]: I0224 00:10:28.400452 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a3174486-c5bc-4ef6-925d-70554d62d1f9-kube-api-access\") pod \"a3174486-c5bc-4ef6-925d-70554d62d1f9\" (UID: \"a3174486-c5bc-4ef6-925d-70554d62d1f9\") " Feb 24 00:10:28 crc kubenswrapper[4824]: I0224 00:10:28.400575 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3174486-c5bc-4ef6-925d-70554d62d1f9-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a3174486-c5bc-4ef6-925d-70554d62d1f9" (UID: "a3174486-c5bc-4ef6-925d-70554d62d1f9"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 00:10:28 crc kubenswrapper[4824]: I0224 00:10:28.400819 4824 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a3174486-c5bc-4ef6-925d-70554d62d1f9-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 24 00:10:28 crc kubenswrapper[4824]: I0224 00:10:28.409922 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3174486-c5bc-4ef6-925d-70554d62d1f9-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a3174486-c5bc-4ef6-925d-70554d62d1f9" (UID: "a3174486-c5bc-4ef6-925d-70554d62d1f9"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:10:28 crc kubenswrapper[4824]: I0224 00:10:28.502250 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a3174486-c5bc-4ef6-925d-70554d62d1f9-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 24 00:10:29 crc kubenswrapper[4824]: I0224 00:10:29.070876 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"a3174486-c5bc-4ef6-925d-70554d62d1f9","Type":"ContainerDied","Data":"1c14510da4c8e4a0e7f0e53a8715b277a09e7b810b82c332501cf22906c6321c"} Feb 24 00:10:29 crc kubenswrapper[4824]: I0224 00:10:29.071411 4824 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c14510da4c8e4a0e7f0e53a8715b277a09e7b810b82c332501cf22906c6321c" Feb 24 00:10:29 crc kubenswrapper[4824]: I0224 00:10:29.070984 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 24 00:10:36 crc kubenswrapper[4824]: I0224 00:10:36.397026 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-r4c4b" Feb 24 00:10:36 crc kubenswrapper[4824]: I0224 00:10:36.478254 4824 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Feb 24 00:10:37 crc kubenswrapper[4824]: I0224 00:10:37.227874 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-599d8ff48-qktrf"] Feb 24 00:10:37 crc kubenswrapper[4824]: I0224 00:10:37.228323 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-599d8ff48-qktrf" podUID="c9ca581b-1f92-4494-ab07-3c56396e862c" containerName="controller-manager" containerID="cri-o://61152d725e3742568d9637a36b75cafdf6279903aa2b72696fb965f86fc0262a" 
gracePeriod=30 Feb 24 00:10:37 crc kubenswrapper[4824]: I0224 00:10:37.337332 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7968fcb589-gvmfw"] Feb 24 00:10:37 crc kubenswrapper[4824]: I0224 00:10:37.338279 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7968fcb589-gvmfw" podUID="921fd719-248a-40f2-901e-de82a8c6b9bc" containerName="route-controller-manager" containerID="cri-o://d35da3b92a34ebadb664ea295a072909420d8af61a251763865432159776fa7d" gracePeriod=30 Feb 24 00:10:39 crc kubenswrapper[4824]: I0224 00:10:39.123784 4824 generic.go:334] "Generic (PLEG): container finished" podID="921fd719-248a-40f2-901e-de82a8c6b9bc" containerID="d35da3b92a34ebadb664ea295a072909420d8af61a251763865432159776fa7d" exitCode=0 Feb 24 00:10:39 crc kubenswrapper[4824]: I0224 00:10:39.124133 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7968fcb589-gvmfw" event={"ID":"921fd719-248a-40f2-901e-de82a8c6b9bc","Type":"ContainerDied","Data":"d35da3b92a34ebadb664ea295a072909420d8af61a251763865432159776fa7d"} Feb 24 00:10:39 crc kubenswrapper[4824]: I0224 00:10:39.126842 4824 generic.go:334] "Generic (PLEG): container finished" podID="c9ca581b-1f92-4494-ab07-3c56396e862c" containerID="61152d725e3742568d9637a36b75cafdf6279903aa2b72696fb965f86fc0262a" exitCode=0 Feb 24 00:10:39 crc kubenswrapper[4824]: I0224 00:10:39.126887 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-599d8ff48-qktrf" event={"ID":"c9ca581b-1f92-4494-ab07-3c56396e862c","Type":"ContainerDied","Data":"61152d725e3742568d9637a36b75cafdf6279903aa2b72696fb965f86fc0262a"} Feb 24 00:10:39 crc kubenswrapper[4824]: I0224 00:10:39.291993 4824 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7968fcb589-gvmfw" Feb 24 00:10:39 crc kubenswrapper[4824]: I0224 00:10:39.337828 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f4db755ff-dfrn8"] Feb 24 00:10:39 crc kubenswrapper[4824]: E0224 00:10:39.338719 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="921fd719-248a-40f2-901e-de82a8c6b9bc" containerName="route-controller-manager" Feb 24 00:10:39 crc kubenswrapper[4824]: I0224 00:10:39.338751 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="921fd719-248a-40f2-901e-de82a8c6b9bc" containerName="route-controller-manager" Feb 24 00:10:39 crc kubenswrapper[4824]: E0224 00:10:39.338769 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3174486-c5bc-4ef6-925d-70554d62d1f9" containerName="pruner" Feb 24 00:10:39 crc kubenswrapper[4824]: I0224 00:10:39.338782 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3174486-c5bc-4ef6-925d-70554d62d1f9" containerName="pruner" Feb 24 00:10:39 crc kubenswrapper[4824]: I0224 00:10:39.338916 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="921fd719-248a-40f2-901e-de82a8c6b9bc" containerName="route-controller-manager" Feb 24 00:10:39 crc kubenswrapper[4824]: I0224 00:10:39.338936 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3174486-c5bc-4ef6-925d-70554d62d1f9" containerName="pruner" Feb 24 00:10:39 crc kubenswrapper[4824]: I0224 00:10:39.339481 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f4db755ff-dfrn8" Feb 24 00:10:39 crc kubenswrapper[4824]: I0224 00:10:39.345567 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f4db755ff-dfrn8"] Feb 24 00:10:39 crc kubenswrapper[4824]: I0224 00:10:39.380995 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64rs6\" (UniqueName: \"kubernetes.io/projected/921fd719-248a-40f2-901e-de82a8c6b9bc-kube-api-access-64rs6\") pod \"921fd719-248a-40f2-901e-de82a8c6b9bc\" (UID: \"921fd719-248a-40f2-901e-de82a8c6b9bc\") " Feb 24 00:10:39 crc kubenswrapper[4824]: I0224 00:10:39.382192 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/921fd719-248a-40f2-901e-de82a8c6b9bc-config\") pod \"921fd719-248a-40f2-901e-de82a8c6b9bc\" (UID: \"921fd719-248a-40f2-901e-de82a8c6b9bc\") " Feb 24 00:10:39 crc kubenswrapper[4824]: I0224 00:10:39.382264 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/921fd719-248a-40f2-901e-de82a8c6b9bc-client-ca\") pod \"921fd719-248a-40f2-901e-de82a8c6b9bc\" (UID: \"921fd719-248a-40f2-901e-de82a8c6b9bc\") " Feb 24 00:10:39 crc kubenswrapper[4824]: I0224 00:10:39.382379 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/921fd719-248a-40f2-901e-de82a8c6b9bc-serving-cert\") pod \"921fd719-248a-40f2-901e-de82a8c6b9bc\" (UID: \"921fd719-248a-40f2-901e-de82a8c6b9bc\") " Feb 24 00:10:39 crc kubenswrapper[4824]: I0224 00:10:39.383163 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/921fd719-248a-40f2-901e-de82a8c6b9bc-client-ca" (OuterVolumeSpecName: "client-ca") pod "921fd719-248a-40f2-901e-de82a8c6b9bc" 
(UID: "921fd719-248a-40f2-901e-de82a8c6b9bc"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:10:39 crc kubenswrapper[4824]: I0224 00:10:39.383508 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/921fd719-248a-40f2-901e-de82a8c6b9bc-config" (OuterVolumeSpecName: "config") pod "921fd719-248a-40f2-901e-de82a8c6b9bc" (UID: "921fd719-248a-40f2-901e-de82a8c6b9bc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:10:39 crc kubenswrapper[4824]: I0224 00:10:39.388231 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/921fd719-248a-40f2-901e-de82a8c6b9bc-kube-api-access-64rs6" (OuterVolumeSpecName: "kube-api-access-64rs6") pod "921fd719-248a-40f2-901e-de82a8c6b9bc" (UID: "921fd719-248a-40f2-901e-de82a8c6b9bc"). InnerVolumeSpecName "kube-api-access-64rs6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:10:39 crc kubenswrapper[4824]: I0224 00:10:39.395231 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/921fd719-248a-40f2-901e-de82a8c6b9bc-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "921fd719-248a-40f2-901e-de82a8c6b9bc" (UID: "921fd719-248a-40f2-901e-de82a8c6b9bc"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:10:39 crc kubenswrapper[4824]: I0224 00:10:39.484141 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c800062-d998-4df3-97e1-ca5df1a57de9-config\") pod \"route-controller-manager-7f4db755ff-dfrn8\" (UID: \"1c800062-d998-4df3-97e1-ca5df1a57de9\") " pod="openshift-route-controller-manager/route-controller-manager-7f4db755ff-dfrn8" Feb 24 00:10:39 crc kubenswrapper[4824]: I0224 00:10:39.484441 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1c800062-d998-4df3-97e1-ca5df1a57de9-client-ca\") pod \"route-controller-manager-7f4db755ff-dfrn8\" (UID: \"1c800062-d998-4df3-97e1-ca5df1a57de9\") " pod="openshift-route-controller-manager/route-controller-manager-7f4db755ff-dfrn8" Feb 24 00:10:39 crc kubenswrapper[4824]: I0224 00:10:39.484619 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c800062-d998-4df3-97e1-ca5df1a57de9-serving-cert\") pod \"route-controller-manager-7f4db755ff-dfrn8\" (UID: \"1c800062-d998-4df3-97e1-ca5df1a57de9\") " pod="openshift-route-controller-manager/route-controller-manager-7f4db755ff-dfrn8" Feb 24 00:10:39 crc kubenswrapper[4824]: I0224 00:10:39.484713 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45tz4\" (UniqueName: \"kubernetes.io/projected/1c800062-d998-4df3-97e1-ca5df1a57de9-kube-api-access-45tz4\") pod \"route-controller-manager-7f4db755ff-dfrn8\" (UID: \"1c800062-d998-4df3-97e1-ca5df1a57de9\") " pod="openshift-route-controller-manager/route-controller-manager-7f4db755ff-dfrn8" Feb 24 00:10:39 crc kubenswrapper[4824]: I0224 00:10:39.485233 4824 reconciler_common.go:293] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/921fd719-248a-40f2-901e-de82a8c6b9bc-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 00:10:39 crc kubenswrapper[4824]: I0224 00:10:39.485289 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64rs6\" (UniqueName: \"kubernetes.io/projected/921fd719-248a-40f2-901e-de82a8c6b9bc-kube-api-access-64rs6\") on node \"crc\" DevicePath \"\"" Feb 24 00:10:39 crc kubenswrapper[4824]: I0224 00:10:39.485316 4824 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/921fd719-248a-40f2-901e-de82a8c6b9bc-config\") on node \"crc\" DevicePath \"\"" Feb 24 00:10:39 crc kubenswrapper[4824]: I0224 00:10:39.485329 4824 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/921fd719-248a-40f2-901e-de82a8c6b9bc-client-ca\") on node \"crc\" DevicePath \"\"" Feb 24 00:10:39 crc kubenswrapper[4824]: I0224 00:10:39.586770 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1c800062-d998-4df3-97e1-ca5df1a57de9-client-ca\") pod \"route-controller-manager-7f4db755ff-dfrn8\" (UID: \"1c800062-d998-4df3-97e1-ca5df1a57de9\") " pod="openshift-route-controller-manager/route-controller-manager-7f4db755ff-dfrn8" Feb 24 00:10:39 crc kubenswrapper[4824]: I0224 00:10:39.586821 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c800062-d998-4df3-97e1-ca5df1a57de9-serving-cert\") pod \"route-controller-manager-7f4db755ff-dfrn8\" (UID: \"1c800062-d998-4df3-97e1-ca5df1a57de9\") " pod="openshift-route-controller-manager/route-controller-manager-7f4db755ff-dfrn8" Feb 24 00:10:39 crc kubenswrapper[4824]: I0224 00:10:39.586851 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45tz4\" (UniqueName: 
\"kubernetes.io/projected/1c800062-d998-4df3-97e1-ca5df1a57de9-kube-api-access-45tz4\") pod \"route-controller-manager-7f4db755ff-dfrn8\" (UID: \"1c800062-d998-4df3-97e1-ca5df1a57de9\") " pod="openshift-route-controller-manager/route-controller-manager-7f4db755ff-dfrn8" Feb 24 00:10:39 crc kubenswrapper[4824]: I0224 00:10:39.586947 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c800062-d998-4df3-97e1-ca5df1a57de9-config\") pod \"route-controller-manager-7f4db755ff-dfrn8\" (UID: \"1c800062-d998-4df3-97e1-ca5df1a57de9\") " pod="openshift-route-controller-manager/route-controller-manager-7f4db755ff-dfrn8" Feb 24 00:10:39 crc kubenswrapper[4824]: I0224 00:10:39.587973 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1c800062-d998-4df3-97e1-ca5df1a57de9-client-ca\") pod \"route-controller-manager-7f4db755ff-dfrn8\" (UID: \"1c800062-d998-4df3-97e1-ca5df1a57de9\") " pod="openshift-route-controller-manager/route-controller-manager-7f4db755ff-dfrn8" Feb 24 00:10:39 crc kubenswrapper[4824]: I0224 00:10:39.589953 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c800062-d998-4df3-97e1-ca5df1a57de9-config\") pod \"route-controller-manager-7f4db755ff-dfrn8\" (UID: \"1c800062-d998-4df3-97e1-ca5df1a57de9\") " pod="openshift-route-controller-manager/route-controller-manager-7f4db755ff-dfrn8" Feb 24 00:10:39 crc kubenswrapper[4824]: I0224 00:10:39.591665 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c800062-d998-4df3-97e1-ca5df1a57de9-serving-cert\") pod \"route-controller-manager-7f4db755ff-dfrn8\" (UID: \"1c800062-d998-4df3-97e1-ca5df1a57de9\") " pod="openshift-route-controller-manager/route-controller-manager-7f4db755ff-dfrn8" Feb 24 00:10:39 crc kubenswrapper[4824]: 
I0224 00:10:39.604296 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45tz4\" (UniqueName: \"kubernetes.io/projected/1c800062-d998-4df3-97e1-ca5df1a57de9-kube-api-access-45tz4\") pod \"route-controller-manager-7f4db755ff-dfrn8\" (UID: \"1c800062-d998-4df3-97e1-ca5df1a57de9\") " pod="openshift-route-controller-manager/route-controller-manager-7f4db755ff-dfrn8" Feb 24 00:10:39 crc kubenswrapper[4824]: I0224 00:10:39.627363 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-599d8ff48-qktrf" Feb 24 00:10:39 crc kubenswrapper[4824]: I0224 00:10:39.665944 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f4db755ff-dfrn8" Feb 24 00:10:39 crc kubenswrapper[4824]: I0224 00:10:39.789596 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c9ca581b-1f92-4494-ab07-3c56396e862c-client-ca\") pod \"c9ca581b-1f92-4494-ab07-3c56396e862c\" (UID: \"c9ca581b-1f92-4494-ab07-3c56396e862c\") " Feb 24 00:10:39 crc kubenswrapper[4824]: I0224 00:10:39.790089 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9ca581b-1f92-4494-ab07-3c56396e862c-serving-cert\") pod \"c9ca581b-1f92-4494-ab07-3c56396e862c\" (UID: \"c9ca581b-1f92-4494-ab07-3c56396e862c\") " Feb 24 00:10:39 crc kubenswrapper[4824]: I0224 00:10:39.790144 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c9ca581b-1f92-4494-ab07-3c56396e862c-proxy-ca-bundles\") pod \"c9ca581b-1f92-4494-ab07-3c56396e862c\" (UID: \"c9ca581b-1f92-4494-ab07-3c56396e862c\") " Feb 24 00:10:39 crc kubenswrapper[4824]: I0224 00:10:39.790236 4824 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9ca581b-1f92-4494-ab07-3c56396e862c-config\") pod \"c9ca581b-1f92-4494-ab07-3c56396e862c\" (UID: \"c9ca581b-1f92-4494-ab07-3c56396e862c\") " Feb 24 00:10:39 crc kubenswrapper[4824]: I0224 00:10:39.790266 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8k4h\" (UniqueName: \"kubernetes.io/projected/c9ca581b-1f92-4494-ab07-3c56396e862c-kube-api-access-s8k4h\") pod \"c9ca581b-1f92-4494-ab07-3c56396e862c\" (UID: \"c9ca581b-1f92-4494-ab07-3c56396e862c\") " Feb 24 00:10:39 crc kubenswrapper[4824]: I0224 00:10:39.791818 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9ca581b-1f92-4494-ab07-3c56396e862c-client-ca" (OuterVolumeSpecName: "client-ca") pod "c9ca581b-1f92-4494-ab07-3c56396e862c" (UID: "c9ca581b-1f92-4494-ab07-3c56396e862c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:10:39 crc kubenswrapper[4824]: I0224 00:10:39.793665 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9ca581b-1f92-4494-ab07-3c56396e862c-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "c9ca581b-1f92-4494-ab07-3c56396e862c" (UID: "c9ca581b-1f92-4494-ab07-3c56396e862c"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:10:39 crc kubenswrapper[4824]: I0224 00:10:39.794149 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9ca581b-1f92-4494-ab07-3c56396e862c-config" (OuterVolumeSpecName: "config") pod "c9ca581b-1f92-4494-ab07-3c56396e862c" (UID: "c9ca581b-1f92-4494-ab07-3c56396e862c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:10:39 crc kubenswrapper[4824]: I0224 00:10:39.797885 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9ca581b-1f92-4494-ab07-3c56396e862c-kube-api-access-s8k4h" (OuterVolumeSpecName: "kube-api-access-s8k4h") pod "c9ca581b-1f92-4494-ab07-3c56396e862c" (UID: "c9ca581b-1f92-4494-ab07-3c56396e862c"). InnerVolumeSpecName "kube-api-access-s8k4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:10:39 crc kubenswrapper[4824]: I0224 00:10:39.798256 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9ca581b-1f92-4494-ab07-3c56396e862c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c9ca581b-1f92-4494-ab07-3c56396e862c" (UID: "c9ca581b-1f92-4494-ab07-3c56396e862c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:10:39 crc kubenswrapper[4824]: I0224 00:10:39.892830 4824 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9ca581b-1f92-4494-ab07-3c56396e862c-config\") on node \"crc\" DevicePath \"\"" Feb 24 00:10:39 crc kubenswrapper[4824]: I0224 00:10:39.892892 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8k4h\" (UniqueName: \"kubernetes.io/projected/c9ca581b-1f92-4494-ab07-3c56396e862c-kube-api-access-s8k4h\") on node \"crc\" DevicePath \"\"" Feb 24 00:10:39 crc kubenswrapper[4824]: I0224 00:10:39.892910 4824 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c9ca581b-1f92-4494-ab07-3c56396e862c-client-ca\") on node \"crc\" DevicePath \"\"" Feb 24 00:10:39 crc kubenswrapper[4824]: I0224 00:10:39.892922 4824 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9ca581b-1f92-4494-ab07-3c56396e862c-serving-cert\") on node \"crc\" DevicePath 
\"\"" Feb 24 00:10:39 crc kubenswrapper[4824]: I0224 00:10:39.892931 4824 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c9ca581b-1f92-4494-ab07-3c56396e862c-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 24 00:10:40 crc kubenswrapper[4824]: I0224 00:10:40.145638 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f4db755ff-dfrn8"] Feb 24 00:10:40 crc kubenswrapper[4824]: I0224 00:10:40.149285 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-599d8ff48-qktrf" Feb 24 00:10:40 crc kubenswrapper[4824]: I0224 00:10:40.149688 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-599d8ff48-qktrf" event={"ID":"c9ca581b-1f92-4494-ab07-3c56396e862c","Type":"ContainerDied","Data":"3348d9f9c7ede94db1fa63c997fae65b1355a60f270f2e59b5e41ad0ae9f9767"} Feb 24 00:10:40 crc kubenswrapper[4824]: I0224 00:10:40.149856 4824 scope.go:117] "RemoveContainer" containerID="61152d725e3742568d9637a36b75cafdf6279903aa2b72696fb965f86fc0262a" Feb 24 00:10:40 crc kubenswrapper[4824]: I0224 00:10:40.153472 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gl27t" event={"ID":"2da73289-3f96-4828-a106-46c3b0469e7d","Type":"ContainerStarted","Data":"09e81517976ec38b505938bb2df2f3b6123c4b30e798621cb83825dcef2c35b1"} Feb 24 00:10:40 crc kubenswrapper[4824]: I0224 00:10:40.164360 4824 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7968fcb589-gvmfw" Feb 24 00:10:40 crc kubenswrapper[4824]: I0224 00:10:40.164341 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7968fcb589-gvmfw" event={"ID":"921fd719-248a-40f2-901e-de82a8c6b9bc","Type":"ContainerDied","Data":"ebe73061062b5fb6815834caff328cf47e5bdaff5958c1bf3a4054a250e190c9"} Feb 24 00:10:40 crc kubenswrapper[4824]: I0224 00:10:40.177475 4824 scope.go:117] "RemoveContainer" containerID="d35da3b92a34ebadb664ea295a072909420d8af61a251763865432159776fa7d" Feb 24 00:10:40 crc kubenswrapper[4824]: I0224 00:10:40.223058 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7968fcb589-gvmfw"] Feb 24 00:10:40 crc kubenswrapper[4824]: I0224 00:10:40.230429 4824 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7968fcb589-gvmfw"] Feb 24 00:10:40 crc kubenswrapper[4824]: I0224 00:10:40.234401 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-599d8ff48-qktrf"] Feb 24 00:10:40 crc kubenswrapper[4824]: I0224 00:10:40.236822 4824 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-599d8ff48-qktrf"] Feb 24 00:10:40 crc kubenswrapper[4824]: I0224 00:10:40.708284 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="921fd719-248a-40f2-901e-de82a8c6b9bc" path="/var/lib/kubelet/pods/921fd719-248a-40f2-901e-de82a8c6b9bc/volumes" Feb 24 00:10:40 crc kubenswrapper[4824]: I0224 00:10:40.708911 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9ca581b-1f92-4494-ab07-3c56396e862c" path="/var/lib/kubelet/pods/c9ca581b-1f92-4494-ab07-3c56396e862c/volumes" Feb 24 00:10:41 crc kubenswrapper[4824]: I0224 00:10:41.171829 4824 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f4db755ff-dfrn8" event={"ID":"1c800062-d998-4df3-97e1-ca5df1a57de9","Type":"ContainerStarted","Data":"b96e0735ee158d70b84382ad4a8a1094ebe1136cbc03596e96e9bd01b1c192ba"} Feb 24 00:10:41 crc kubenswrapper[4824]: I0224 00:10:41.171908 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f4db755ff-dfrn8" event={"ID":"1c800062-d998-4df3-97e1-ca5df1a57de9","Type":"ContainerStarted","Data":"0a7a42e872a1ae8acc7406739cad82b1452bea819a0d086289fe5c4eb6a595fd"} Feb 24 00:10:41 crc kubenswrapper[4824]: I0224 00:10:41.175355 4824 generic.go:334] "Generic (PLEG): container finished" podID="2da73289-3f96-4828-a106-46c3b0469e7d" containerID="09e81517976ec38b505938bb2df2f3b6123c4b30e798621cb83825dcef2c35b1" exitCode=0 Feb 24 00:10:41 crc kubenswrapper[4824]: I0224 00:10:41.175426 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gl27t" event={"ID":"2da73289-3f96-4828-a106-46c3b0469e7d","Type":"ContainerDied","Data":"09e81517976ec38b505938bb2df2f3b6123c4b30e798621cb83825dcef2c35b1"} Feb 24 00:10:41 crc kubenswrapper[4824]: I0224 00:10:41.176748 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mfzkw" event={"ID":"08de7fe0-2d54-408b-8e09-3e1b9bcf931a","Type":"ContainerStarted","Data":"c12d91234669c462b12713951c50eb0c0d50df4a3939e1596a445f3e78f2d6f0"} Feb 24 00:10:41 crc kubenswrapper[4824]: I0224 00:10:41.180145 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nzqwf" event={"ID":"b142d96b-87c3-444b-b135-fdddaa658234","Type":"ContainerStarted","Data":"8bd5382363dfe954b11d2958183ea67ba5ab63752a6364c784c6c9e09c7286e0"} Feb 24 00:10:41 crc kubenswrapper[4824]: I0224 00:10:41.980752 4824 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-controller-manager/controller-manager-6b978c5766-k8r5n"] Feb 24 00:10:41 crc kubenswrapper[4824]: E0224 00:10:41.981450 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9ca581b-1f92-4494-ab07-3c56396e862c" containerName="controller-manager" Feb 24 00:10:41 crc kubenswrapper[4824]: I0224 00:10:41.981475 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9ca581b-1f92-4494-ab07-3c56396e862c" containerName="controller-manager" Feb 24 00:10:41 crc kubenswrapper[4824]: I0224 00:10:41.981725 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9ca581b-1f92-4494-ab07-3c56396e862c" containerName="controller-manager" Feb 24 00:10:41 crc kubenswrapper[4824]: I0224 00:10:41.982345 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6b978c5766-k8r5n" Feb 24 00:10:41 crc kubenswrapper[4824]: I0224 00:10:41.985274 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 24 00:10:41 crc kubenswrapper[4824]: I0224 00:10:41.987032 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 24 00:10:41 crc kubenswrapper[4824]: I0224 00:10:41.987468 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 24 00:10:41 crc kubenswrapper[4824]: I0224 00:10:41.987642 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 24 00:10:41 crc kubenswrapper[4824]: I0224 00:10:41.987828 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 24 00:10:41 crc kubenswrapper[4824]: I0224 00:10:41.988601 4824 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 24 00:10:41 crc kubenswrapper[4824]: I0224 00:10:41.995318 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6b978c5766-k8r5n"] Feb 24 00:10:41 crc kubenswrapper[4824]: I0224 00:10:41.995964 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 24 00:10:42 crc kubenswrapper[4824]: I0224 00:10:42.137422 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48b7664a-e47f-4e07-b650-093a751f389f-config\") pod \"controller-manager-6b978c5766-k8r5n\" (UID: \"48b7664a-e47f-4e07-b650-093a751f389f\") " pod="openshift-controller-manager/controller-manager-6b978c5766-k8r5n" Feb 24 00:10:42 crc kubenswrapper[4824]: I0224 00:10:42.137600 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/48b7664a-e47f-4e07-b650-093a751f389f-proxy-ca-bundles\") pod \"controller-manager-6b978c5766-k8r5n\" (UID: \"48b7664a-e47f-4e07-b650-093a751f389f\") " pod="openshift-controller-manager/controller-manager-6b978c5766-k8r5n" Feb 24 00:10:42 crc kubenswrapper[4824]: I0224 00:10:42.137648 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfx2m\" (UniqueName: \"kubernetes.io/projected/48b7664a-e47f-4e07-b650-093a751f389f-kube-api-access-vfx2m\") pod \"controller-manager-6b978c5766-k8r5n\" (UID: \"48b7664a-e47f-4e07-b650-093a751f389f\") " pod="openshift-controller-manager/controller-manager-6b978c5766-k8r5n" Feb 24 00:10:42 crc kubenswrapper[4824]: I0224 00:10:42.137715 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/48b7664a-e47f-4e07-b650-093a751f389f-client-ca\") pod \"controller-manager-6b978c5766-k8r5n\" (UID: \"48b7664a-e47f-4e07-b650-093a751f389f\") " pod="openshift-controller-manager/controller-manager-6b978c5766-k8r5n" Feb 24 00:10:42 crc kubenswrapper[4824]: I0224 00:10:42.137976 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48b7664a-e47f-4e07-b650-093a751f389f-serving-cert\") pod \"controller-manager-6b978c5766-k8r5n\" (UID: \"48b7664a-e47f-4e07-b650-093a751f389f\") " pod="openshift-controller-manager/controller-manager-6b978c5766-k8r5n" Feb 24 00:10:42 crc kubenswrapper[4824]: I0224 00:10:42.188259 4824 generic.go:334] "Generic (PLEG): container finished" podID="08de7fe0-2d54-408b-8e09-3e1b9bcf931a" containerID="c12d91234669c462b12713951c50eb0c0d50df4a3939e1596a445f3e78f2d6f0" exitCode=0 Feb 24 00:10:42 crc kubenswrapper[4824]: I0224 00:10:42.188341 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mfzkw" event={"ID":"08de7fe0-2d54-408b-8e09-3e1b9bcf931a","Type":"ContainerDied","Data":"c12d91234669c462b12713951c50eb0c0d50df4a3939e1596a445f3e78f2d6f0"} Feb 24 00:10:42 crc kubenswrapper[4824]: I0224 00:10:42.191308 4824 generic.go:334] "Generic (PLEG): container finished" podID="b142d96b-87c3-444b-b135-fdddaa658234" containerID="8bd5382363dfe954b11d2958183ea67ba5ab63752a6364c784c6c9e09c7286e0" exitCode=0 Feb 24 00:10:42 crc kubenswrapper[4824]: I0224 00:10:42.191425 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nzqwf" event={"ID":"b142d96b-87c3-444b-b135-fdddaa658234","Type":"ContainerDied","Data":"8bd5382363dfe954b11d2958183ea67ba5ab63752a6364c784c6c9e09c7286e0"} Feb 24 00:10:42 crc kubenswrapper[4824]: I0224 00:10:42.191588 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-route-controller-manager/route-controller-manager-7f4db755ff-dfrn8" Feb 24 00:10:42 crc kubenswrapper[4824]: I0224 00:10:42.208214 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7f4db755ff-dfrn8" Feb 24 00:10:42 crc kubenswrapper[4824]: I0224 00:10:42.239897 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48b7664a-e47f-4e07-b650-093a751f389f-config\") pod \"controller-manager-6b978c5766-k8r5n\" (UID: \"48b7664a-e47f-4e07-b650-093a751f389f\") " pod="openshift-controller-manager/controller-manager-6b978c5766-k8r5n" Feb 24 00:10:42 crc kubenswrapper[4824]: I0224 00:10:42.239976 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/48b7664a-e47f-4e07-b650-093a751f389f-proxy-ca-bundles\") pod \"controller-manager-6b978c5766-k8r5n\" (UID: \"48b7664a-e47f-4e07-b650-093a751f389f\") " pod="openshift-controller-manager/controller-manager-6b978c5766-k8r5n" Feb 24 00:10:42 crc kubenswrapper[4824]: I0224 00:10:42.240006 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfx2m\" (UniqueName: \"kubernetes.io/projected/48b7664a-e47f-4e07-b650-093a751f389f-kube-api-access-vfx2m\") pod \"controller-manager-6b978c5766-k8r5n\" (UID: \"48b7664a-e47f-4e07-b650-093a751f389f\") " pod="openshift-controller-manager/controller-manager-6b978c5766-k8r5n" Feb 24 00:10:42 crc kubenswrapper[4824]: I0224 00:10:42.240025 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/48b7664a-e47f-4e07-b650-093a751f389f-client-ca\") pod \"controller-manager-6b978c5766-k8r5n\" (UID: \"48b7664a-e47f-4e07-b650-093a751f389f\") " pod="openshift-controller-manager/controller-manager-6b978c5766-k8r5n" Feb 24 00:10:42 
crc kubenswrapper[4824]: I0224 00:10:42.240062 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48b7664a-e47f-4e07-b650-093a751f389f-serving-cert\") pod \"controller-manager-6b978c5766-k8r5n\" (UID: \"48b7664a-e47f-4e07-b650-093a751f389f\") " pod="openshift-controller-manager/controller-manager-6b978c5766-k8r5n" Feb 24 00:10:42 crc kubenswrapper[4824]: I0224 00:10:42.242371 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/48b7664a-e47f-4e07-b650-093a751f389f-client-ca\") pod \"controller-manager-6b978c5766-k8r5n\" (UID: \"48b7664a-e47f-4e07-b650-093a751f389f\") " pod="openshift-controller-manager/controller-manager-6b978c5766-k8r5n" Feb 24 00:10:42 crc kubenswrapper[4824]: I0224 00:10:42.242805 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48b7664a-e47f-4e07-b650-093a751f389f-config\") pod \"controller-manager-6b978c5766-k8r5n\" (UID: \"48b7664a-e47f-4e07-b650-093a751f389f\") " pod="openshift-controller-manager/controller-manager-6b978c5766-k8r5n" Feb 24 00:10:42 crc kubenswrapper[4824]: I0224 00:10:42.243031 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/48b7664a-e47f-4e07-b650-093a751f389f-proxy-ca-bundles\") pod \"controller-manager-6b978c5766-k8r5n\" (UID: \"48b7664a-e47f-4e07-b650-093a751f389f\") " pod="openshift-controller-manager/controller-manager-6b978c5766-k8r5n" Feb 24 00:10:42 crc kubenswrapper[4824]: I0224 00:10:42.255607 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48b7664a-e47f-4e07-b650-093a751f389f-serving-cert\") pod \"controller-manager-6b978c5766-k8r5n\" (UID: \"48b7664a-e47f-4e07-b650-093a751f389f\") " 
pod="openshift-controller-manager/controller-manager-6b978c5766-k8r5n" Feb 24 00:10:42 crc kubenswrapper[4824]: I0224 00:10:42.262653 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7f4db755ff-dfrn8" podStartSLOduration=5.262613921 podStartE2EDuration="5.262613921s" podCreationTimestamp="2026-02-24 00:10:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:10:42.256174706 +0000 UTC m=+306.245799195" watchObservedRunningTime="2026-02-24 00:10:42.262613921 +0000 UTC m=+306.252238400" Feb 24 00:10:42 crc kubenswrapper[4824]: I0224 00:10:42.264180 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfx2m\" (UniqueName: \"kubernetes.io/projected/48b7664a-e47f-4e07-b650-093a751f389f-kube-api-access-vfx2m\") pod \"controller-manager-6b978c5766-k8r5n\" (UID: \"48b7664a-e47f-4e07-b650-093a751f389f\") " pod="openshift-controller-manager/controller-manager-6b978c5766-k8r5n" Feb 24 00:10:42 crc kubenswrapper[4824]: I0224 00:10:42.347383 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6b978c5766-k8r5n" Feb 24 00:10:49 crc kubenswrapper[4824]: I0224 00:10:49.637263 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6b978c5766-k8r5n"] Feb 24 00:10:51 crc kubenswrapper[4824]: I0224 00:10:51.023851 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-jf5jw" podUID="6f8699c7-58f5-4a80-b5af-5403cb178676" containerName="oauth-openshift" containerID="cri-o://fc75bfe4b562302aad30993aa2a68489589d802790238c3eeef171430ffcd747" gracePeriod=15 Feb 24 00:10:51 crc kubenswrapper[4824]: I0224 00:10:51.268270 4824 generic.go:334] "Generic (PLEG): container finished" podID="6f8699c7-58f5-4a80-b5af-5403cb178676" containerID="fc75bfe4b562302aad30993aa2a68489589d802790238c3eeef171430ffcd747" exitCode=0 Feb 24 00:10:51 crc kubenswrapper[4824]: I0224 00:10:51.268354 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-jf5jw" event={"ID":"6f8699c7-58f5-4a80-b5af-5403cb178676","Type":"ContainerDied","Data":"fc75bfe4b562302aad30993aa2a68489589d802790238c3eeef171430ffcd747"} Feb 24 00:10:51 crc kubenswrapper[4824]: I0224 00:10:51.270300 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6b978c5766-k8r5n" event={"ID":"48b7664a-e47f-4e07-b650-093a751f389f","Type":"ContainerStarted","Data":"104c6e9bfb9c8cc307927700a237315280842a4940df005e0dfceb2b9121c433"} Feb 24 00:10:51 crc kubenswrapper[4824]: I0224 00:10:51.417538 4824 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-jf5jw" Feb 24 00:10:51 crc kubenswrapper[4824]: I0224 00:10:51.491850 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-user-template-provider-selection\") pod \"6f8699c7-58f5-4a80-b5af-5403cb178676\" (UID: \"6f8699c7-58f5-4a80-b5af-5403cb178676\") " Feb 24 00:10:51 crc kubenswrapper[4824]: I0224 00:10:51.492173 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-system-service-ca\") pod \"6f8699c7-58f5-4a80-b5af-5403cb178676\" (UID: \"6f8699c7-58f5-4a80-b5af-5403cb178676\") " Feb 24 00:10:51 crc kubenswrapper[4824]: I0224 00:10:51.492198 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-system-trusted-ca-bundle\") pod \"6f8699c7-58f5-4a80-b5af-5403cb178676\" (UID: \"6f8699c7-58f5-4a80-b5af-5403cb178676\") " Feb 24 00:10:51 crc kubenswrapper[4824]: I0224 00:10:51.492226 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-system-serving-cert\") pod \"6f8699c7-58f5-4a80-b5af-5403cb178676\" (UID: \"6f8699c7-58f5-4a80-b5af-5403cb178676\") " Feb 24 00:10:51 crc kubenswrapper[4824]: I0224 00:10:51.492254 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-user-idp-0-file-data\") pod 
\"6f8699c7-58f5-4a80-b5af-5403cb178676\" (UID: \"6f8699c7-58f5-4a80-b5af-5403cb178676\") " Feb 24 00:10:51 crc kubenswrapper[4824]: I0224 00:10:51.492270 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6f8699c7-58f5-4a80-b5af-5403cb178676-audit-policies\") pod \"6f8699c7-58f5-4a80-b5af-5403cb178676\" (UID: \"6f8699c7-58f5-4a80-b5af-5403cb178676\") " Feb 24 00:10:51 crc kubenswrapper[4824]: I0224 00:10:51.492290 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-system-session\") pod \"6f8699c7-58f5-4a80-b5af-5403cb178676\" (UID: \"6f8699c7-58f5-4a80-b5af-5403cb178676\") " Feb 24 00:10:51 crc kubenswrapper[4824]: I0224 00:10:51.492313 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-user-template-login\") pod \"6f8699c7-58f5-4a80-b5af-5403cb178676\" (UID: \"6f8699c7-58f5-4a80-b5af-5403cb178676\") " Feb 24 00:10:51 crc kubenswrapper[4824]: I0224 00:10:51.492337 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-system-cliconfig\") pod \"6f8699c7-58f5-4a80-b5af-5403cb178676\" (UID: \"6f8699c7-58f5-4a80-b5af-5403cb178676\") " Feb 24 00:10:51 crc kubenswrapper[4824]: I0224 00:10:51.492354 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-system-router-certs\") pod \"6f8699c7-58f5-4a80-b5af-5403cb178676\" (UID: \"6f8699c7-58f5-4a80-b5af-5403cb178676\") " Feb 24 00:10:51 crc 
kubenswrapper[4824]: I0224 00:10:51.492376 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-user-template-error\") pod \"6f8699c7-58f5-4a80-b5af-5403cb178676\" (UID: \"6f8699c7-58f5-4a80-b5af-5403cb178676\") " Feb 24 00:10:51 crc kubenswrapper[4824]: I0224 00:10:51.492399 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4dl5\" (UniqueName: \"kubernetes.io/projected/6f8699c7-58f5-4a80-b5af-5403cb178676-kube-api-access-z4dl5\") pod \"6f8699c7-58f5-4a80-b5af-5403cb178676\" (UID: \"6f8699c7-58f5-4a80-b5af-5403cb178676\") " Feb 24 00:10:51 crc kubenswrapper[4824]: I0224 00:10:51.492420 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-system-ocp-branding-template\") pod \"6f8699c7-58f5-4a80-b5af-5403cb178676\" (UID: \"6f8699c7-58f5-4a80-b5af-5403cb178676\") " Feb 24 00:10:51 crc kubenswrapper[4824]: I0224 00:10:51.492890 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "6f8699c7-58f5-4a80-b5af-5403cb178676" (UID: "6f8699c7-58f5-4a80-b5af-5403cb178676"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:10:51 crc kubenswrapper[4824]: I0224 00:10:51.493469 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6f8699c7-58f5-4a80-b5af-5403cb178676-audit-dir\") pod \"6f8699c7-58f5-4a80-b5af-5403cb178676\" (UID: \"6f8699c7-58f5-4a80-b5af-5403cb178676\") " Feb 24 00:10:51 crc kubenswrapper[4824]: I0224 00:10:51.493683 4824 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 24 00:10:51 crc kubenswrapper[4824]: I0224 00:10:51.493709 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6f8699c7-58f5-4a80-b5af-5403cb178676-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "6f8699c7-58f5-4a80-b5af-5403cb178676" (UID: "6f8699c7-58f5-4a80-b5af-5403cb178676"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 00:10:51 crc kubenswrapper[4824]: I0224 00:10:51.494001 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "6f8699c7-58f5-4a80-b5af-5403cb178676" (UID: "6f8699c7-58f5-4a80-b5af-5403cb178676"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:10:51 crc kubenswrapper[4824]: I0224 00:10:51.497461 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "6f8699c7-58f5-4a80-b5af-5403cb178676" (UID: "6f8699c7-58f5-4a80-b5af-5403cb178676"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:10:51 crc kubenswrapper[4824]: I0224 00:10:51.498081 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f8699c7-58f5-4a80-b5af-5403cb178676-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "6f8699c7-58f5-4a80-b5af-5403cb178676" (UID: "6f8699c7-58f5-4a80-b5af-5403cb178676"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:10:51 crc kubenswrapper[4824]: I0224 00:10:51.503748 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "6f8699c7-58f5-4a80-b5af-5403cb178676" (UID: "6f8699c7-58f5-4a80-b5af-5403cb178676"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:10:51 crc kubenswrapper[4824]: I0224 00:10:51.505400 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "6f8699c7-58f5-4a80-b5af-5403cb178676" (UID: "6f8699c7-58f5-4a80-b5af-5403cb178676"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:10:51 crc kubenswrapper[4824]: I0224 00:10:51.505572 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "6f8699c7-58f5-4a80-b5af-5403cb178676" (UID: "6f8699c7-58f5-4a80-b5af-5403cb178676"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:10:51 crc kubenswrapper[4824]: I0224 00:10:51.505923 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "6f8699c7-58f5-4a80-b5af-5403cb178676" (UID: "6f8699c7-58f5-4a80-b5af-5403cb178676"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:10:51 crc kubenswrapper[4824]: I0224 00:10:51.505980 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f8699c7-58f5-4a80-b5af-5403cb178676-kube-api-access-z4dl5" (OuterVolumeSpecName: "kube-api-access-z4dl5") pod "6f8699c7-58f5-4a80-b5af-5403cb178676" (UID: "6f8699c7-58f5-4a80-b5af-5403cb178676"). InnerVolumeSpecName "kube-api-access-z4dl5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:10:51 crc kubenswrapper[4824]: I0224 00:10:51.506963 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "6f8699c7-58f5-4a80-b5af-5403cb178676" (UID: "6f8699c7-58f5-4a80-b5af-5403cb178676"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:10:51 crc kubenswrapper[4824]: I0224 00:10:51.507700 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "6f8699c7-58f5-4a80-b5af-5403cb178676" (UID: "6f8699c7-58f5-4a80-b5af-5403cb178676"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:10:51 crc kubenswrapper[4824]: I0224 00:10:51.507944 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "6f8699c7-58f5-4a80-b5af-5403cb178676" (UID: "6f8699c7-58f5-4a80-b5af-5403cb178676"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:10:51 crc kubenswrapper[4824]: I0224 00:10:51.508716 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "6f8699c7-58f5-4a80-b5af-5403cb178676" (UID: "6f8699c7-58f5-4a80-b5af-5403cb178676"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:10:51 crc kubenswrapper[4824]: I0224 00:10:51.594312 4824 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 24 00:10:51 crc kubenswrapper[4824]: I0224 00:10:51.594358 4824 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 24 00:10:51 crc kubenswrapper[4824]: I0224 00:10:51.594373 4824 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 24 00:10:51 crc kubenswrapper[4824]: I0224 00:10:51.594388 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4dl5\" (UniqueName: \"kubernetes.io/projected/6f8699c7-58f5-4a80-b5af-5403cb178676-kube-api-access-z4dl5\") on node \"crc\" DevicePath \"\"" Feb 24 00:10:51 crc kubenswrapper[4824]: I0224 00:10:51.594402 4824 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 24 00:10:51 crc kubenswrapper[4824]: I0224 00:10:51.594417 4824 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6f8699c7-58f5-4a80-b5af-5403cb178676-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 24 00:10:51 crc kubenswrapper[4824]: I0224 00:10:51.594432 4824 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 24 00:10:51 crc kubenswrapper[4824]: I0224 00:10:51.594451 4824 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 00:10:51 crc kubenswrapper[4824]: I0224 00:10:51.594465 4824 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 00:10:51 crc kubenswrapper[4824]: I0224 00:10:51.594479 4824 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 24 00:10:51 crc kubenswrapper[4824]: I0224 00:10:51.594491 4824 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6f8699c7-58f5-4a80-b5af-5403cb178676-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 24 00:10:51 crc kubenswrapper[4824]: I0224 00:10:51.594506 4824 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 24 00:10:51 crc kubenswrapper[4824]: I0224 00:10:51.594535 4824 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6f8699c7-58f5-4a80-b5af-5403cb178676-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 24 00:10:51 crc kubenswrapper[4824]: I0224 00:10:51.990908 4824 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-7484f6b95f-s5j25"] Feb 24 00:10:51 crc kubenswrapper[4824]: E0224 00:10:51.991205 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f8699c7-58f5-4a80-b5af-5403cb178676" containerName="oauth-openshift" Feb 24 00:10:51 crc kubenswrapper[4824]: I0224 00:10:51.991226 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f8699c7-58f5-4a80-b5af-5403cb178676" containerName="oauth-openshift" Feb 24 00:10:51 crc kubenswrapper[4824]: I0224 00:10:51.991365 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f8699c7-58f5-4a80-b5af-5403cb178676" containerName="oauth-openshift" Feb 24 00:10:51 crc kubenswrapper[4824]: I0224 00:10:51.991922 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7484f6b95f-s5j25" Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.027826 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7484f6b95f-s5j25"] Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.100134 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/070c1558-fa58-45f4-9e1e-e4a7d6e21ee3-v4-0-config-system-router-certs\") pod \"oauth-openshift-7484f6b95f-s5j25\" (UID: \"070c1558-fa58-45f4-9e1e-e4a7d6e21ee3\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-s5j25" Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.100551 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/070c1558-fa58-45f4-9e1e-e4a7d6e21ee3-v4-0-config-user-template-login\") pod \"oauth-openshift-7484f6b95f-s5j25\" (UID: \"070c1558-fa58-45f4-9e1e-e4a7d6e21ee3\") " 
pod="openshift-authentication/oauth-openshift-7484f6b95f-s5j25" Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.100585 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/070c1558-fa58-45f4-9e1e-e4a7d6e21ee3-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7484f6b95f-s5j25\" (UID: \"070c1558-fa58-45f4-9e1e-e4a7d6e21ee3\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-s5j25" Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.100723 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/070c1558-fa58-45f4-9e1e-e4a7d6e21ee3-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7484f6b95f-s5j25\" (UID: \"070c1558-fa58-45f4-9e1e-e4a7d6e21ee3\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-s5j25" Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.100749 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/070c1558-fa58-45f4-9e1e-e4a7d6e21ee3-v4-0-config-system-session\") pod \"oauth-openshift-7484f6b95f-s5j25\" (UID: \"070c1558-fa58-45f4-9e1e-e4a7d6e21ee3\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-s5j25" Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.100773 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/070c1558-fa58-45f4-9e1e-e4a7d6e21ee3-v4-0-config-system-service-ca\") pod \"oauth-openshift-7484f6b95f-s5j25\" (UID: \"070c1558-fa58-45f4-9e1e-e4a7d6e21ee3\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-s5j25" Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.100790 4824 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/070c1558-fa58-45f4-9e1e-e4a7d6e21ee3-v4-0-config-user-template-error\") pod \"oauth-openshift-7484f6b95f-s5j25\" (UID: \"070c1558-fa58-45f4-9e1e-e4a7d6e21ee3\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-s5j25" Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.100832 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/070c1558-fa58-45f4-9e1e-e4a7d6e21ee3-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7484f6b95f-s5j25\" (UID: \"070c1558-fa58-45f4-9e1e-e4a7d6e21ee3\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-s5j25" Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.100852 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/070c1558-fa58-45f4-9e1e-e4a7d6e21ee3-audit-dir\") pod \"oauth-openshift-7484f6b95f-s5j25\" (UID: \"070c1558-fa58-45f4-9e1e-e4a7d6e21ee3\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-s5j25" Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.100880 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/070c1558-fa58-45f4-9e1e-e4a7d6e21ee3-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7484f6b95f-s5j25\" (UID: \"070c1558-fa58-45f4-9e1e-e4a7d6e21ee3\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-s5j25" Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.100916 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/070c1558-fa58-45f4-9e1e-e4a7d6e21ee3-audit-policies\") 
pod \"oauth-openshift-7484f6b95f-s5j25\" (UID: \"070c1558-fa58-45f4-9e1e-e4a7d6e21ee3\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-s5j25" Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.100940 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqddq\" (UniqueName: \"kubernetes.io/projected/070c1558-fa58-45f4-9e1e-e4a7d6e21ee3-kube-api-access-vqddq\") pod \"oauth-openshift-7484f6b95f-s5j25\" (UID: \"070c1558-fa58-45f4-9e1e-e4a7d6e21ee3\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-s5j25" Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.100960 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/070c1558-fa58-45f4-9e1e-e4a7d6e21ee3-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7484f6b95f-s5j25\" (UID: \"070c1558-fa58-45f4-9e1e-e4a7d6e21ee3\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-s5j25" Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.101034 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/070c1558-fa58-45f4-9e1e-e4a7d6e21ee3-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7484f6b95f-s5j25\" (UID: \"070c1558-fa58-45f4-9e1e-e4a7d6e21ee3\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-s5j25" Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.202138 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/070c1558-fa58-45f4-9e1e-e4a7d6e21ee3-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7484f6b95f-s5j25\" (UID: \"070c1558-fa58-45f4-9e1e-e4a7d6e21ee3\") " 
pod="openshift-authentication/oauth-openshift-7484f6b95f-s5j25" Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.202592 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/070c1558-fa58-45f4-9e1e-e4a7d6e21ee3-v4-0-config-system-router-certs\") pod \"oauth-openshift-7484f6b95f-s5j25\" (UID: \"070c1558-fa58-45f4-9e1e-e4a7d6e21ee3\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-s5j25" Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.202627 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/070c1558-fa58-45f4-9e1e-e4a7d6e21ee3-v4-0-config-user-template-login\") pod \"oauth-openshift-7484f6b95f-s5j25\" (UID: \"070c1558-fa58-45f4-9e1e-e4a7d6e21ee3\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-s5j25" Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.202665 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/070c1558-fa58-45f4-9e1e-e4a7d6e21ee3-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7484f6b95f-s5j25\" (UID: \"070c1558-fa58-45f4-9e1e-e4a7d6e21ee3\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-s5j25" Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.202720 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/070c1558-fa58-45f4-9e1e-e4a7d6e21ee3-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7484f6b95f-s5j25\" (UID: \"070c1558-fa58-45f4-9e1e-e4a7d6e21ee3\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-s5j25" Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.202749 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/070c1558-fa58-45f4-9e1e-e4a7d6e21ee3-v4-0-config-system-session\") pod \"oauth-openshift-7484f6b95f-s5j25\" (UID: \"070c1558-fa58-45f4-9e1e-e4a7d6e21ee3\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-s5j25" Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.202777 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/070c1558-fa58-45f4-9e1e-e4a7d6e21ee3-v4-0-config-system-service-ca\") pod \"oauth-openshift-7484f6b95f-s5j25\" (UID: \"070c1558-fa58-45f4-9e1e-e4a7d6e21ee3\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-s5j25" Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.202804 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/070c1558-fa58-45f4-9e1e-e4a7d6e21ee3-v4-0-config-user-template-error\") pod \"oauth-openshift-7484f6b95f-s5j25\" (UID: \"070c1558-fa58-45f4-9e1e-e4a7d6e21ee3\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-s5j25" Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.202843 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/070c1558-fa58-45f4-9e1e-e4a7d6e21ee3-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7484f6b95f-s5j25\" (UID: \"070c1558-fa58-45f4-9e1e-e4a7d6e21ee3\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-s5j25" Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.202875 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/070c1558-fa58-45f4-9e1e-e4a7d6e21ee3-audit-dir\") pod \"oauth-openshift-7484f6b95f-s5j25\" (UID: \"070c1558-fa58-45f4-9e1e-e4a7d6e21ee3\") " 
pod="openshift-authentication/oauth-openshift-7484f6b95f-s5j25" Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.202929 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/070c1558-fa58-45f4-9e1e-e4a7d6e21ee3-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7484f6b95f-s5j25\" (UID: \"070c1558-fa58-45f4-9e1e-e4a7d6e21ee3\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-s5j25" Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.202979 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/070c1558-fa58-45f4-9e1e-e4a7d6e21ee3-audit-policies\") pod \"oauth-openshift-7484f6b95f-s5j25\" (UID: \"070c1558-fa58-45f4-9e1e-e4a7d6e21ee3\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-s5j25" Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.203006 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqddq\" (UniqueName: \"kubernetes.io/projected/070c1558-fa58-45f4-9e1e-e4a7d6e21ee3-kube-api-access-vqddq\") pod \"oauth-openshift-7484f6b95f-s5j25\" (UID: \"070c1558-fa58-45f4-9e1e-e4a7d6e21ee3\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-s5j25" Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.203034 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/070c1558-fa58-45f4-9e1e-e4a7d6e21ee3-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7484f6b95f-s5j25\" (UID: \"070c1558-fa58-45f4-9e1e-e4a7d6e21ee3\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-s5j25" Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.203784 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" 
(UniqueName: \"kubernetes.io/configmap/070c1558-fa58-45f4-9e1e-e4a7d6e21ee3-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7484f6b95f-s5j25\" (UID: \"070c1558-fa58-45f4-9e1e-e4a7d6e21ee3\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-s5j25" Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.203871 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/070c1558-fa58-45f4-9e1e-e4a7d6e21ee3-v4-0-config-system-service-ca\") pod \"oauth-openshift-7484f6b95f-s5j25\" (UID: \"070c1558-fa58-45f4-9e1e-e4a7d6e21ee3\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-s5j25" Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.203880 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/070c1558-fa58-45f4-9e1e-e4a7d6e21ee3-audit-dir\") pod \"oauth-openshift-7484f6b95f-s5j25\" (UID: \"070c1558-fa58-45f4-9e1e-e4a7d6e21ee3\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-s5j25" Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.204176 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/070c1558-fa58-45f4-9e1e-e4a7d6e21ee3-audit-policies\") pod \"oauth-openshift-7484f6b95f-s5j25\" (UID: \"070c1558-fa58-45f4-9e1e-e4a7d6e21ee3\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-s5j25" Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.204452 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/070c1558-fa58-45f4-9e1e-e4a7d6e21ee3-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7484f6b95f-s5j25\" (UID: \"070c1558-fa58-45f4-9e1e-e4a7d6e21ee3\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-s5j25" Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 
00:10:52.209212 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/070c1558-fa58-45f4-9e1e-e4a7d6e21ee3-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7484f6b95f-s5j25\" (UID: \"070c1558-fa58-45f4-9e1e-e4a7d6e21ee3\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-s5j25" Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.209753 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/070c1558-fa58-45f4-9e1e-e4a7d6e21ee3-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7484f6b95f-s5j25\" (UID: \"070c1558-fa58-45f4-9e1e-e4a7d6e21ee3\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-s5j25" Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.211807 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/070c1558-fa58-45f4-9e1e-e4a7d6e21ee3-v4-0-config-user-template-error\") pod \"oauth-openshift-7484f6b95f-s5j25\" (UID: \"070c1558-fa58-45f4-9e1e-e4a7d6e21ee3\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-s5j25" Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.216097 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/070c1558-fa58-45f4-9e1e-e4a7d6e21ee3-v4-0-config-system-session\") pod \"oauth-openshift-7484f6b95f-s5j25\" (UID: \"070c1558-fa58-45f4-9e1e-e4a7d6e21ee3\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-s5j25" Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.216412 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/070c1558-fa58-45f4-9e1e-e4a7d6e21ee3-v4-0-config-system-router-certs\") pod 
\"oauth-openshift-7484f6b95f-s5j25\" (UID: \"070c1558-fa58-45f4-9e1e-e4a7d6e21ee3\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-s5j25" Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.217113 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/070c1558-fa58-45f4-9e1e-e4a7d6e21ee3-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7484f6b95f-s5j25\" (UID: \"070c1558-fa58-45f4-9e1e-e4a7d6e21ee3\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-s5j25" Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.220565 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/070c1558-fa58-45f4-9e1e-e4a7d6e21ee3-v4-0-config-user-template-login\") pod \"oauth-openshift-7484f6b95f-s5j25\" (UID: \"070c1558-fa58-45f4-9e1e-e4a7d6e21ee3\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-s5j25" Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.220614 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/070c1558-fa58-45f4-9e1e-e4a7d6e21ee3-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7484f6b95f-s5j25\" (UID: \"070c1558-fa58-45f4-9e1e-e4a7d6e21ee3\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-s5j25" Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.230131 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqddq\" (UniqueName: \"kubernetes.io/projected/070c1558-fa58-45f4-9e1e-e4a7d6e21ee3-kube-api-access-vqddq\") pod \"oauth-openshift-7484f6b95f-s5j25\" (UID: \"070c1558-fa58-45f4-9e1e-e4a7d6e21ee3\") " pod="openshift-authentication/oauth-openshift-7484f6b95f-s5j25" Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.278448 4824 generic.go:334] "Generic (PLEG): 
container finished" podID="3e306ddf-071d-47f2-b9b1-bf772963438e" containerID="4336adaefce1f631229f06eda9fede5b34bd7e94028955471812962455639142" exitCode=0 Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.278544 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hhftg" event={"ID":"3e306ddf-071d-47f2-b9b1-bf772963438e","Type":"ContainerDied","Data":"4336adaefce1f631229f06eda9fede5b34bd7e94028955471812962455639142"} Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.279883 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6b978c5766-k8r5n" event={"ID":"48b7664a-e47f-4e07-b650-093a751f389f","Type":"ContainerStarted","Data":"0f4168ffce744376cf910cb3ea91a3c95473e9ff3db2ca447e36bdb7404167f5"} Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.280156 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6b978c5766-k8r5n" Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.283453 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gl27t" event={"ID":"2da73289-3f96-4828-a106-46c3b0469e7d","Type":"ContainerStarted","Data":"f3129bb41cd26ff02fd1b16661272cba00c8572d524dd0295795e4e681de10f0"} Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.289158 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zxplg" event={"ID":"7a78c7d6-6ec6-4857-af87-25c5c8cf961d","Type":"ContainerStarted","Data":"ae3090316a207f659563cb6daa67a1cc4d3c280950cb420d2cf6d0ddebc465d5"} Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.292185 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6b978c5766-k8r5n" Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.292484 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-authentication/oauth-openshift-558db77b4-jf5jw" event={"ID":"6f8699c7-58f5-4a80-b5af-5403cb178676","Type":"ContainerDied","Data":"ac2733fb1a358b53d6cecdc04c18db6dd2ffab884268bd9f970b2082f8018667"} Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.292531 4824 scope.go:117] "RemoveContainer" containerID="fc75bfe4b562302aad30993aa2a68489589d802790238c3eeef171430ffcd747" Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.293231 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-jf5jw" Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.299133 4824 generic.go:334] "Generic (PLEG): container finished" podID="b00860ed-9085-40bb-9041-16eac6d88fb1" containerID="3ea11f300330e655d35201dc1283dc47672fa1f01b0818a96fdef96fb4638961" exitCode=0 Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.299205 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bfhcg" event={"ID":"b00860ed-9085-40bb-9041-16eac6d88fb1","Type":"ContainerDied","Data":"3ea11f300330e655d35201dc1283dc47672fa1f01b0818a96fdef96fb4638961"} Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.320279 4824 generic.go:334] "Generic (PLEG): container finished" podID="cc119514-5c95-4925-8a1a-3e6844a34e1e" containerID="49dee3eafa362f85cb1a0da9a797f4ffd8f10b2e2e1b307ecd8002d23b24ca9f" exitCode=0 Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.320365 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dmjz7" event={"ID":"cc119514-5c95-4925-8a1a-3e6844a34e1e","Type":"ContainerDied","Data":"49dee3eafa362f85cb1a0da9a797f4ffd8f10b2e2e1b307ecd8002d23b24ca9f"} Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.343493 4824 generic.go:334] "Generic (PLEG): container finished" podID="b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4" containerID="25d34726cfc81bf777c29be29729340267c9cddf9a761fe641537d6b9d860999" 
exitCode=0 Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.343607 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6kgrd" event={"ID":"b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4","Type":"ContainerDied","Data":"25d34726cfc81bf777c29be29729340267c9cddf9a761fe641537d6b9d860999"} Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.353647 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6b978c5766-k8r5n" podStartSLOduration=15.353624148 podStartE2EDuration="15.353624148s" podCreationTimestamp="2026-02-24 00:10:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:10:52.353101854 +0000 UTC m=+316.342726343" watchObservedRunningTime="2026-02-24 00:10:52.353624148 +0000 UTC m=+316.343248617" Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.359900 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mfzkw" event={"ID":"08de7fe0-2d54-408b-8e09-3e1b9bcf931a","Type":"ContainerStarted","Data":"508ed9b0876f895bd93f2220b145fd5e2693472af97b22cae0f711fdc293c35c"} Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.400882 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-7484f6b95f-s5j25" Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.401229 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nzqwf" event={"ID":"b142d96b-87c3-444b-b135-fdddaa658234","Type":"ContainerStarted","Data":"d8cb34947ec733a964cff732c9bb70c2d8c98ea3a605270b5ec9f8c81b631a37"} Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.432216 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gl27t" podStartSLOduration=3.750445131 podStartE2EDuration="1m12.43219018s" podCreationTimestamp="2026-02-24 00:09:40 +0000 UTC" firstStartedPulling="2026-02-24 00:09:42.387644297 +0000 UTC m=+246.377268756" lastFinishedPulling="2026-02-24 00:10:51.069389336 +0000 UTC m=+315.059013805" observedRunningTime="2026-02-24 00:10:52.394935534 +0000 UTC m=+316.384560013" watchObservedRunningTime="2026-02-24 00:10:52.43219018 +0000 UTC m=+316.421814649" Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.495060 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mfzkw" podStartSLOduration=3.71584468 podStartE2EDuration="1m13.495023942s" podCreationTimestamp="2026-02-24 00:09:39 +0000 UTC" firstStartedPulling="2026-02-24 00:09:41.316761038 +0000 UTC m=+245.306385517" lastFinishedPulling="2026-02-24 00:10:51.09594031 +0000 UTC m=+315.085564779" observedRunningTime="2026-02-24 00:10:52.471010608 +0000 UTC m=+316.460635087" watchObservedRunningTime="2026-02-24 00:10:52.495023942 +0000 UTC m=+316.484648411" Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.495719 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nzqwf" podStartSLOduration=3.669338203 podStartE2EDuration="1m13.495712681s" podCreationTimestamp="2026-02-24 00:09:39 +0000 UTC" 
firstStartedPulling="2026-02-24 00:09:41.269458639 +0000 UTC m=+245.259083108" lastFinishedPulling="2026-02-24 00:10:51.095833117 +0000 UTC m=+315.085457586" observedRunningTime="2026-02-24 00:10:52.494118237 +0000 UTC m=+316.483742706" watchObservedRunningTime="2026-02-24 00:10:52.495712681 +0000 UTC m=+316.485337150" Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.513091 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jf5jw"] Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.520496 4824 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jf5jw"] Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.720307 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f8699c7-58f5-4a80-b5af-5403cb178676" path="/var/lib/kubelet/pods/6f8699c7-58f5-4a80-b5af-5403cb178676/volumes" Feb 24 00:10:52 crc kubenswrapper[4824]: I0224 00:10:52.953128 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7484f6b95f-s5j25"] Feb 24 00:10:52 crc kubenswrapper[4824]: W0224 00:10:52.959180 4824 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod070c1558_fa58_45f4_9e1e_e4a7d6e21ee3.slice/crio-0f3830e47bcb43566b64917b2f2d3c488062533e57fbfa30dfa8c6b847b5dc00 WatchSource:0}: Error finding container 0f3830e47bcb43566b64917b2f2d3c488062533e57fbfa30dfa8c6b847b5dc00: Status 404 returned error can't find the container with id 0f3830e47bcb43566b64917b2f2d3c488062533e57fbfa30dfa8c6b847b5dc00 Feb 24 00:10:53 crc kubenswrapper[4824]: I0224 00:10:53.409241 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bfhcg" event={"ID":"b00860ed-9085-40bb-9041-16eac6d88fb1","Type":"ContainerStarted","Data":"740ff9f30206ac3dc9b4d8ae780788f1992408f2e27b01ff6a0bd56b96684f80"} Feb 24 00:10:53 
crc kubenswrapper[4824]: I0224 00:10:53.412268 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dmjz7" event={"ID":"cc119514-5c95-4925-8a1a-3e6844a34e1e","Type":"ContainerStarted","Data":"ace099e248827a099f2f9964ea688049a17101eb26b429197dc321a43c82c971"} Feb 24 00:10:53 crc kubenswrapper[4824]: I0224 00:10:53.415392 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hhftg" event={"ID":"3e306ddf-071d-47f2-b9b1-bf772963438e","Type":"ContainerStarted","Data":"83d0a00bbb287f8c717cb0e93e56c8a769b62fbe8a1114585fcf0819cddb1d85"} Feb 24 00:10:53 crc kubenswrapper[4824]: I0224 00:10:53.417853 4824 generic.go:334] "Generic (PLEG): container finished" podID="7a78c7d6-6ec6-4857-af87-25c5c8cf961d" containerID="ae3090316a207f659563cb6daa67a1cc4d3c280950cb420d2cf6d0ddebc465d5" exitCode=0 Feb 24 00:10:53 crc kubenswrapper[4824]: I0224 00:10:53.417920 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zxplg" event={"ID":"7a78c7d6-6ec6-4857-af87-25c5c8cf961d","Type":"ContainerDied","Data":"ae3090316a207f659563cb6daa67a1cc4d3c280950cb420d2cf6d0ddebc465d5"} Feb 24 00:10:53 crc kubenswrapper[4824]: I0224 00:10:53.420615 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7484f6b95f-s5j25" event={"ID":"070c1558-fa58-45f4-9e1e-e4a7d6e21ee3","Type":"ContainerStarted","Data":"0894c00150287c8fef9770d78ff81ad105a8a7bfb8ab9033aea7214f4baad5be"} Feb 24 00:10:53 crc kubenswrapper[4824]: I0224 00:10:53.420660 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7484f6b95f-s5j25" event={"ID":"070c1558-fa58-45f4-9e1e-e4a7d6e21ee3","Type":"ContainerStarted","Data":"0f3830e47bcb43566b64917b2f2d3c488062533e57fbfa30dfa8c6b847b5dc00"} Feb 24 00:10:53 crc kubenswrapper[4824]: I0224 00:10:53.421011 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-authentication/oauth-openshift-7484f6b95f-s5j25" Feb 24 00:10:53 crc kubenswrapper[4824]: I0224 00:10:53.422429 4824 patch_prober.go:28] interesting pod/oauth-openshift-7484f6b95f-s5j25 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.64:6443/healthz\": dial tcp 10.217.0.64:6443: connect: connection refused" start-of-body= Feb 24 00:10:53 crc kubenswrapper[4824]: I0224 00:10:53.422460 4824 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-7484f6b95f-s5j25" podUID="070c1558-fa58-45f4-9e1e-e4a7d6e21ee3" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.64:6443/healthz\": dial tcp 10.217.0.64:6443: connect: connection refused" Feb 24 00:10:53 crc kubenswrapper[4824]: I0224 00:10:53.424594 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6kgrd" event={"ID":"b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4","Type":"ContainerStarted","Data":"1d7b69ec3003cc3743308fff410ebd338aa2b0cc906c10d99169f9adff2cb524"} Feb 24 00:10:53 crc kubenswrapper[4824]: I0224 00:10:53.433722 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bfhcg" podStartSLOduration=3.925618017 podStartE2EDuration="1m16.433701666s" podCreationTimestamp="2026-02-24 00:09:37 +0000 UTC" firstStartedPulling="2026-02-24 00:09:40.253409546 +0000 UTC m=+244.243034015" lastFinishedPulling="2026-02-24 00:10:52.761493195 +0000 UTC m=+316.751117664" observedRunningTime="2026-02-24 00:10:53.433054178 +0000 UTC m=+317.422678647" watchObservedRunningTime="2026-02-24 00:10:53.433701666 +0000 UTC m=+317.423326135" Feb 24 00:10:53 crc kubenswrapper[4824]: I0224 00:10:53.454076 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6kgrd" podStartSLOduration=3.746534953 
podStartE2EDuration="1m16.454050951s" podCreationTimestamp="2026-02-24 00:09:37 +0000 UTC" firstStartedPulling="2026-02-24 00:09:40.117168759 +0000 UTC m=+244.106793228" lastFinishedPulling="2026-02-24 00:10:52.824684757 +0000 UTC m=+316.814309226" observedRunningTime="2026-02-24 00:10:53.45254931 +0000 UTC m=+317.442173779" watchObservedRunningTime="2026-02-24 00:10:53.454050951 +0000 UTC m=+317.443675420" Feb 24 00:10:53 crc kubenswrapper[4824]: I0224 00:10:53.500822 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hhftg" podStartSLOduration=3.939027639 podStartE2EDuration="1m16.500802575s" podCreationTimestamp="2026-02-24 00:09:37 +0000 UTC" firstStartedPulling="2026-02-24 00:09:40.117052686 +0000 UTC m=+244.106677155" lastFinishedPulling="2026-02-24 00:10:52.678827622 +0000 UTC m=+316.668452091" observedRunningTime="2026-02-24 00:10:53.499603902 +0000 UTC m=+317.489228381" watchObservedRunningTime="2026-02-24 00:10:53.500802575 +0000 UTC m=+317.490427044" Feb 24 00:10:53 crc kubenswrapper[4824]: I0224 00:10:53.525387 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dmjz7" podStartSLOduration=3.797124815 podStartE2EDuration="1m16.525370544s" podCreationTimestamp="2026-02-24 00:09:37 +0000 UTC" firstStartedPulling="2026-02-24 00:09:40.225108855 +0000 UTC m=+244.214733334" lastFinishedPulling="2026-02-24 00:10:52.953354594 +0000 UTC m=+316.942979063" observedRunningTime="2026-02-24 00:10:53.523486863 +0000 UTC m=+317.513111342" watchObservedRunningTime="2026-02-24 00:10:53.525370544 +0000 UTC m=+317.514995013" Feb 24 00:10:53 crc kubenswrapper[4824]: I0224 00:10:53.549168 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7484f6b95f-s5j25" podStartSLOduration=27.549142232 podStartE2EDuration="27.549142232s" podCreationTimestamp="2026-02-24 00:10:26 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:10:53.547095266 +0000 UTC m=+317.536719755" watchObservedRunningTime="2026-02-24 00:10:53.549142232 +0000 UTC m=+317.538766701" Feb 24 00:10:54 crc kubenswrapper[4824]: I0224 00:10:54.433924 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zxplg" event={"ID":"7a78c7d6-6ec6-4857-af87-25c5c8cf961d","Type":"ContainerStarted","Data":"da0889e558f40879c1fabc025b056771135a794508cda04386b8dc975f871518"} Feb 24 00:10:54 crc kubenswrapper[4824]: I0224 00:10:54.447940 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7484f6b95f-s5j25" Feb 24 00:10:54 crc kubenswrapper[4824]: I0224 00:10:54.466786 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zxplg" podStartSLOduration=3.015280006 podStartE2EDuration="1m14.466765092s" podCreationTimestamp="2026-02-24 00:09:40 +0000 UTC" firstStartedPulling="2026-02-24 00:09:42.393581022 +0000 UTC m=+246.383205491" lastFinishedPulling="2026-02-24 00:10:53.845066108 +0000 UTC m=+317.834690577" observedRunningTime="2026-02-24 00:10:54.463177844 +0000 UTC m=+318.452802313" watchObservedRunningTime="2026-02-24 00:10:54.466765092 +0000 UTC m=+318.456389561" Feb 24 00:10:57 crc kubenswrapper[4824]: I0224 00:10:57.250270 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6b978c5766-k8r5n"] Feb 24 00:10:57 crc kubenswrapper[4824]: I0224 00:10:57.250577 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6b978c5766-k8r5n" podUID="48b7664a-e47f-4e07-b650-093a751f389f" containerName="controller-manager" containerID="cri-o://0f4168ffce744376cf910cb3ea91a3c95473e9ff3db2ca447e36bdb7404167f5" 
gracePeriod=30 Feb 24 00:10:57 crc kubenswrapper[4824]: I0224 00:10:57.258902 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f4db755ff-dfrn8"] Feb 24 00:10:57 crc kubenswrapper[4824]: I0224 00:10:57.259122 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7f4db755ff-dfrn8" podUID="1c800062-d998-4df3-97e1-ca5df1a57de9" containerName="route-controller-manager" containerID="cri-o://b96e0735ee158d70b84382ad4a8a1094ebe1136cbc03596e96e9bd01b1c192ba" gracePeriod=30 Feb 24 00:10:57 crc kubenswrapper[4824]: I0224 00:10:57.920655 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dmjz7" Feb 24 00:10:57 crc kubenswrapper[4824]: I0224 00:10:57.920720 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dmjz7" Feb 24 00:10:58 crc kubenswrapper[4824]: I0224 00:10:58.150120 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hhftg" Feb 24 00:10:58 crc kubenswrapper[4824]: I0224 00:10:58.150207 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hhftg" Feb 24 00:10:58 crc kubenswrapper[4824]: I0224 00:10:58.272547 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bfhcg" Feb 24 00:10:58 crc kubenswrapper[4824]: I0224 00:10:58.272601 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bfhcg" Feb 24 00:10:58 crc kubenswrapper[4824]: I0224 00:10:58.372756 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6kgrd" Feb 24 00:10:58 crc kubenswrapper[4824]: 
I0224 00:10:58.372836 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6kgrd" Feb 24 00:10:58 crc kubenswrapper[4824]: I0224 00:10:58.438252 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6kgrd" Feb 24 00:10:58 crc kubenswrapper[4824]: I0224 00:10:58.438355 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bfhcg" Feb 24 00:10:58 crc kubenswrapper[4824]: I0224 00:10:58.439116 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dmjz7" Feb 24 00:10:58 crc kubenswrapper[4824]: I0224 00:10:58.439496 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hhftg" Feb 24 00:10:58 crc kubenswrapper[4824]: I0224 00:10:58.459938 4824 generic.go:334] "Generic (PLEG): container finished" podID="1c800062-d998-4df3-97e1-ca5df1a57de9" containerID="b96e0735ee158d70b84382ad4a8a1094ebe1136cbc03596e96e9bd01b1c192ba" exitCode=0 Feb 24 00:10:58 crc kubenswrapper[4824]: I0224 00:10:58.460179 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f4db755ff-dfrn8" event={"ID":"1c800062-d998-4df3-97e1-ca5df1a57de9","Type":"ContainerDied","Data":"b96e0735ee158d70b84382ad4a8a1094ebe1136cbc03596e96e9bd01b1c192ba"} Feb 24 00:10:58 crc kubenswrapper[4824]: I0224 00:10:58.462565 4824 generic.go:334] "Generic (PLEG): container finished" podID="48b7664a-e47f-4e07-b650-093a751f389f" containerID="0f4168ffce744376cf910cb3ea91a3c95473e9ff3db2ca447e36bdb7404167f5" exitCode=0 Feb 24 00:10:58 crc kubenswrapper[4824]: I0224 00:10:58.463420 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6b978c5766-k8r5n" 
event={"ID":"48b7664a-e47f-4e07-b650-093a751f389f","Type":"ContainerDied","Data":"0f4168ffce744376cf910cb3ea91a3c95473e9ff3db2ca447e36bdb7404167f5"} Feb 24 00:10:58 crc kubenswrapper[4824]: I0224 00:10:58.523689 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6kgrd" Feb 24 00:10:58 crc kubenswrapper[4824]: I0224 00:10:58.532694 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bfhcg" Feb 24 00:10:58 crc kubenswrapper[4824]: I0224 00:10:58.543713 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hhftg" Feb 24 00:10:58 crc kubenswrapper[4824]: I0224 00:10:58.544086 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dmjz7" Feb 24 00:10:58 crc kubenswrapper[4824]: I0224 00:10:58.561269 4824 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f4db755ff-dfrn8" Feb 24 00:10:58 crc kubenswrapper[4824]: I0224 00:10:58.641318 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7646d7595b-xmkcx"] Feb 24 00:10:58 crc kubenswrapper[4824]: E0224 00:10:58.642306 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c800062-d998-4df3-97e1-ca5df1a57de9" containerName="route-controller-manager" Feb 24 00:10:58 crc kubenswrapper[4824]: I0224 00:10:58.642393 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c800062-d998-4df3-97e1-ca5df1a57de9" containerName="route-controller-manager" Feb 24 00:10:58 crc kubenswrapper[4824]: I0224 00:10:58.642596 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c800062-d998-4df3-97e1-ca5df1a57de9" containerName="route-controller-manager" Feb 24 00:10:58 crc kubenswrapper[4824]: I0224 00:10:58.643080 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7646d7595b-xmkcx" Feb 24 00:10:58 crc kubenswrapper[4824]: I0224 00:10:58.667957 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7646d7595b-xmkcx"] Feb 24 00:10:58 crc kubenswrapper[4824]: I0224 00:10:58.709276 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c800062-d998-4df3-97e1-ca5df1a57de9-serving-cert\") pod \"1c800062-d998-4df3-97e1-ca5df1a57de9\" (UID: \"1c800062-d998-4df3-97e1-ca5df1a57de9\") " Feb 24 00:10:58 crc kubenswrapper[4824]: I0224 00:10:58.709344 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c800062-d998-4df3-97e1-ca5df1a57de9-config\") pod \"1c800062-d998-4df3-97e1-ca5df1a57de9\" (UID: \"1c800062-d998-4df3-97e1-ca5df1a57de9\") " Feb 24 00:10:58 crc kubenswrapper[4824]: I0224 00:10:58.709420 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45tz4\" (UniqueName: \"kubernetes.io/projected/1c800062-d998-4df3-97e1-ca5df1a57de9-kube-api-access-45tz4\") pod \"1c800062-d998-4df3-97e1-ca5df1a57de9\" (UID: \"1c800062-d998-4df3-97e1-ca5df1a57de9\") " Feb 24 00:10:58 crc kubenswrapper[4824]: I0224 00:10:58.709476 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1c800062-d998-4df3-97e1-ca5df1a57de9-client-ca\") pod \"1c800062-d998-4df3-97e1-ca5df1a57de9\" (UID: \"1c800062-d998-4df3-97e1-ca5df1a57de9\") " Feb 24 00:10:58 crc kubenswrapper[4824]: I0224 00:10:58.710507 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c800062-d998-4df3-97e1-ca5df1a57de9-client-ca" (OuterVolumeSpecName: "client-ca") pod "1c800062-d998-4df3-97e1-ca5df1a57de9" 
(UID: "1c800062-d998-4df3-97e1-ca5df1a57de9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 00:10:58 crc kubenswrapper[4824]: I0224 00:10:58.710588 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c800062-d998-4df3-97e1-ca5df1a57de9-config" (OuterVolumeSpecName: "config") pod "1c800062-d998-4df3-97e1-ca5df1a57de9" (UID: "1c800062-d998-4df3-97e1-ca5df1a57de9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 00:10:58 crc kubenswrapper[4824]: I0224 00:10:58.718639 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c800062-d998-4df3-97e1-ca5df1a57de9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1c800062-d998-4df3-97e1-ca5df1a57de9" (UID: "1c800062-d998-4df3-97e1-ca5df1a57de9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 00:10:58 crc kubenswrapper[4824]: I0224 00:10:58.720727 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c800062-d998-4df3-97e1-ca5df1a57de9-kube-api-access-45tz4" (OuterVolumeSpecName: "kube-api-access-45tz4") pod "1c800062-d998-4df3-97e1-ca5df1a57de9" (UID: "1c800062-d998-4df3-97e1-ca5df1a57de9"). InnerVolumeSpecName "kube-api-access-45tz4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 00:10:58 crc kubenswrapper[4824]: I0224 00:10:58.810588 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tf4mw\" (UniqueName: \"kubernetes.io/projected/4fb39704-7b42-4cdb-8b97-a410aee2e71d-kube-api-access-tf4mw\") pod \"route-controller-manager-7646d7595b-xmkcx\" (UID: \"4fb39704-7b42-4cdb-8b97-a410aee2e71d\") " pod="openshift-route-controller-manager/route-controller-manager-7646d7595b-xmkcx"
Feb 24 00:10:58 crc kubenswrapper[4824]: I0224 00:10:58.810774 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fb39704-7b42-4cdb-8b97-a410aee2e71d-config\") pod \"route-controller-manager-7646d7595b-xmkcx\" (UID: \"4fb39704-7b42-4cdb-8b97-a410aee2e71d\") " pod="openshift-route-controller-manager/route-controller-manager-7646d7595b-xmkcx"
Feb 24 00:10:58 crc kubenswrapper[4824]: I0224 00:10:58.810833 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4fb39704-7b42-4cdb-8b97-a410aee2e71d-client-ca\") pod \"route-controller-manager-7646d7595b-xmkcx\" (UID: \"4fb39704-7b42-4cdb-8b97-a410aee2e71d\") " pod="openshift-route-controller-manager/route-controller-manager-7646d7595b-xmkcx"
Feb 24 00:10:58 crc kubenswrapper[4824]: I0224 00:10:58.810901 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4fb39704-7b42-4cdb-8b97-a410aee2e71d-serving-cert\") pod \"route-controller-manager-7646d7595b-xmkcx\" (UID: \"4fb39704-7b42-4cdb-8b97-a410aee2e71d\") " pod="openshift-route-controller-manager/route-controller-manager-7646d7595b-xmkcx"
Feb 24 00:10:58 crc kubenswrapper[4824]: I0224 00:10:58.811134 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45tz4\" (UniqueName: \"kubernetes.io/projected/1c800062-d998-4df3-97e1-ca5df1a57de9-kube-api-access-45tz4\") on node \"crc\" DevicePath \"\""
Feb 24 00:10:58 crc kubenswrapper[4824]: I0224 00:10:58.811191 4824 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1c800062-d998-4df3-97e1-ca5df1a57de9-client-ca\") on node \"crc\" DevicePath \"\""
Feb 24 00:10:58 crc kubenswrapper[4824]: I0224 00:10:58.811202 4824 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c800062-d998-4df3-97e1-ca5df1a57de9-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 24 00:10:58 crc kubenswrapper[4824]: I0224 00:10:58.811213 4824 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c800062-d998-4df3-97e1-ca5df1a57de9-config\") on node \"crc\" DevicePath \"\""
Feb 24 00:10:58 crc kubenswrapper[4824]: I0224 00:10:58.912284 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fb39704-7b42-4cdb-8b97-a410aee2e71d-config\") pod \"route-controller-manager-7646d7595b-xmkcx\" (UID: \"4fb39704-7b42-4cdb-8b97-a410aee2e71d\") " pod="openshift-route-controller-manager/route-controller-manager-7646d7595b-xmkcx"
Feb 24 00:10:58 crc kubenswrapper[4824]: I0224 00:10:58.912434 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4fb39704-7b42-4cdb-8b97-a410aee2e71d-client-ca\") pod \"route-controller-manager-7646d7595b-xmkcx\" (UID: \"4fb39704-7b42-4cdb-8b97-a410aee2e71d\") " pod="openshift-route-controller-manager/route-controller-manager-7646d7595b-xmkcx"
Feb 24 00:10:58 crc kubenswrapper[4824]: I0224 00:10:58.913606 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4fb39704-7b42-4cdb-8b97-a410aee2e71d-client-ca\") pod \"route-controller-manager-7646d7595b-xmkcx\" (UID: \"4fb39704-7b42-4cdb-8b97-a410aee2e71d\") " pod="openshift-route-controller-manager/route-controller-manager-7646d7595b-xmkcx"
Feb 24 00:10:58 crc kubenswrapper[4824]: I0224 00:10:58.913683 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fb39704-7b42-4cdb-8b97-a410aee2e71d-config\") pod \"route-controller-manager-7646d7595b-xmkcx\" (UID: \"4fb39704-7b42-4cdb-8b97-a410aee2e71d\") " pod="openshift-route-controller-manager/route-controller-manager-7646d7595b-xmkcx"
Feb 24 00:10:58 crc kubenswrapper[4824]: I0224 00:10:58.913707 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4fb39704-7b42-4cdb-8b97-a410aee2e71d-serving-cert\") pod \"route-controller-manager-7646d7595b-xmkcx\" (UID: \"4fb39704-7b42-4cdb-8b97-a410aee2e71d\") " pod="openshift-route-controller-manager/route-controller-manager-7646d7595b-xmkcx"
Feb 24 00:10:58 crc kubenswrapper[4824]: I0224 00:10:58.914324 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tf4mw\" (UniqueName: \"kubernetes.io/projected/4fb39704-7b42-4cdb-8b97-a410aee2e71d-kube-api-access-tf4mw\") pod \"route-controller-manager-7646d7595b-xmkcx\" (UID: \"4fb39704-7b42-4cdb-8b97-a410aee2e71d\") " pod="openshift-route-controller-manager/route-controller-manager-7646d7595b-xmkcx"
Feb 24 00:10:58 crc kubenswrapper[4824]: I0224 00:10:58.917488 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4fb39704-7b42-4cdb-8b97-a410aee2e71d-serving-cert\") pod \"route-controller-manager-7646d7595b-xmkcx\" (UID: \"4fb39704-7b42-4cdb-8b97-a410aee2e71d\") " pod="openshift-route-controller-manager/route-controller-manager-7646d7595b-xmkcx"
Feb 24 00:10:58 crc kubenswrapper[4824]: I0224 00:10:58.936536 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tf4mw\" (UniqueName: \"kubernetes.io/projected/4fb39704-7b42-4cdb-8b97-a410aee2e71d-kube-api-access-tf4mw\") pod \"route-controller-manager-7646d7595b-xmkcx\" (UID: \"4fb39704-7b42-4cdb-8b97-a410aee2e71d\") " pod="openshift-route-controller-manager/route-controller-manager-7646d7595b-xmkcx"
Feb 24 00:10:58 crc kubenswrapper[4824]: I0224 00:10:58.970700 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7646d7595b-xmkcx"
Feb 24 00:10:59 crc kubenswrapper[4824]: I0224 00:10:59.304654 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6b978c5766-k8r5n"
Feb 24 00:10:59 crc kubenswrapper[4824]: I0224 00:10:59.421981 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48b7664a-e47f-4e07-b650-093a751f389f-config\") pod \"48b7664a-e47f-4e07-b650-093a751f389f\" (UID: \"48b7664a-e47f-4e07-b650-093a751f389f\") "
Feb 24 00:10:59 crc kubenswrapper[4824]: I0224 00:10:59.422052 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfx2m\" (UniqueName: \"kubernetes.io/projected/48b7664a-e47f-4e07-b650-093a751f389f-kube-api-access-vfx2m\") pod \"48b7664a-e47f-4e07-b650-093a751f389f\" (UID: \"48b7664a-e47f-4e07-b650-093a751f389f\") "
Feb 24 00:10:59 crc kubenswrapper[4824]: I0224 00:10:59.422077 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/48b7664a-e47f-4e07-b650-093a751f389f-client-ca\") pod \"48b7664a-e47f-4e07-b650-093a751f389f\" (UID: \"48b7664a-e47f-4e07-b650-093a751f389f\") "
Feb 24 00:10:59 crc kubenswrapper[4824]: I0224 00:10:59.422154 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48b7664a-e47f-4e07-b650-093a751f389f-serving-cert\") pod \"48b7664a-e47f-4e07-b650-093a751f389f\" (UID: \"48b7664a-e47f-4e07-b650-093a751f389f\") "
Feb 24 00:10:59 crc kubenswrapper[4824]: I0224 00:10:59.422199 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/48b7664a-e47f-4e07-b650-093a751f389f-proxy-ca-bundles\") pod \"48b7664a-e47f-4e07-b650-093a751f389f\" (UID: \"48b7664a-e47f-4e07-b650-093a751f389f\") "
Feb 24 00:10:59 crc kubenswrapper[4824]: I0224 00:10:59.423369 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48b7664a-e47f-4e07-b650-093a751f389f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "48b7664a-e47f-4e07-b650-093a751f389f" (UID: "48b7664a-e47f-4e07-b650-093a751f389f"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 00:10:59 crc kubenswrapper[4824]: I0224 00:10:59.423578 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48b7664a-e47f-4e07-b650-093a751f389f-client-ca" (OuterVolumeSpecName: "client-ca") pod "48b7664a-e47f-4e07-b650-093a751f389f" (UID: "48b7664a-e47f-4e07-b650-093a751f389f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 00:10:59 crc kubenswrapper[4824]: I0224 00:10:59.424577 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48b7664a-e47f-4e07-b650-093a751f389f-config" (OuterVolumeSpecName: "config") pod "48b7664a-e47f-4e07-b650-093a751f389f" (UID: "48b7664a-e47f-4e07-b650-093a751f389f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 00:10:59 crc kubenswrapper[4824]: I0224 00:10:59.426148 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48b7664a-e47f-4e07-b650-093a751f389f-kube-api-access-vfx2m" (OuterVolumeSpecName: "kube-api-access-vfx2m") pod "48b7664a-e47f-4e07-b650-093a751f389f" (UID: "48b7664a-e47f-4e07-b650-093a751f389f"). InnerVolumeSpecName "kube-api-access-vfx2m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 00:10:59 crc kubenswrapper[4824]: I0224 00:10:59.426869 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48b7664a-e47f-4e07-b650-093a751f389f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "48b7664a-e47f-4e07-b650-093a751f389f" (UID: "48b7664a-e47f-4e07-b650-093a751f389f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 00:10:59 crc kubenswrapper[4824]: I0224 00:10:59.463954 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7646d7595b-xmkcx"]
Feb 24 00:10:59 crc kubenswrapper[4824]: I0224 00:10:59.471296 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6b978c5766-k8r5n"
Feb 24 00:10:59 crc kubenswrapper[4824]: I0224 00:10:59.472197 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6b978c5766-k8r5n" event={"ID":"48b7664a-e47f-4e07-b650-093a751f389f","Type":"ContainerDied","Data":"104c6e9bfb9c8cc307927700a237315280842a4940df005e0dfceb2b9121c433"}
Feb 24 00:10:59 crc kubenswrapper[4824]: I0224 00:10:59.472258 4824 scope.go:117] "RemoveContainer" containerID="0f4168ffce744376cf910cb3ea91a3c95473e9ff3db2ca447e36bdb7404167f5"
Feb 24 00:10:59 crc kubenswrapper[4824]: I0224 00:10:59.475392 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f4db755ff-dfrn8" event={"ID":"1c800062-d998-4df3-97e1-ca5df1a57de9","Type":"ContainerDied","Data":"0a7a42e872a1ae8acc7406739cad82b1452bea819a0d086289fe5c4eb6a595fd"}
Feb 24 00:10:59 crc kubenswrapper[4824]: I0224 00:10:59.475725 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f4db755ff-dfrn8"
Feb 24 00:10:59 crc kubenswrapper[4824]: I0224 00:10:59.494134 4824 scope.go:117] "RemoveContainer" containerID="b96e0735ee158d70b84382ad4a8a1094ebe1136cbc03596e96e9bd01b1c192ba"
Feb 24 00:10:59 crc kubenswrapper[4824]: I0224 00:10:59.511215 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6b978c5766-k8r5n"]
Feb 24 00:10:59 crc kubenswrapper[4824]: I0224 00:10:59.515206 4824 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6b978c5766-k8r5n"]
Feb 24 00:10:59 crc kubenswrapper[4824]: I0224 00:10:59.523997 4824 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48b7664a-e47f-4e07-b650-093a751f389f-config\") on node \"crc\" DevicePath \"\""
Feb 24 00:10:59 crc kubenswrapper[4824]: I0224 00:10:59.524036 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfx2m\" (UniqueName: \"kubernetes.io/projected/48b7664a-e47f-4e07-b650-093a751f389f-kube-api-access-vfx2m\") on node \"crc\" DevicePath \"\""
Feb 24 00:10:59 crc kubenswrapper[4824]: I0224 00:10:59.524046 4824 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/48b7664a-e47f-4e07-b650-093a751f389f-client-ca\") on node \"crc\" DevicePath \"\""
Feb 24 00:10:59 crc kubenswrapper[4824]: I0224 00:10:59.524056 4824 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48b7664a-e47f-4e07-b650-093a751f389f-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 24 00:10:59 crc kubenswrapper[4824]: I0224 00:10:59.524064 4824 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/48b7664a-e47f-4e07-b650-093a751f389f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 24 00:10:59 crc kubenswrapper[4824]: I0224 00:10:59.527113 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f4db755ff-dfrn8"]
Feb 24 00:10:59 crc kubenswrapper[4824]: I0224 00:10:59.530766 4824 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f4db755ff-dfrn8"]
Feb 24 00:10:59 crc kubenswrapper[4824]: I0224 00:10:59.664310 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nzqwf"
Feb 24 00:10:59 crc kubenswrapper[4824]: I0224 00:10:59.664387 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nzqwf"
Feb 24 00:10:59 crc kubenswrapper[4824]: I0224 00:10:59.709216 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nzqwf"
Feb 24 00:10:59 crc kubenswrapper[4824]: I0224 00:10:59.942564 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mfzkw"
Feb 24 00:10:59 crc kubenswrapper[4824]: I0224 00:10:59.942945 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mfzkw"
Feb 24 00:10:59 crc kubenswrapper[4824]: I0224 00:10:59.984587 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mfzkw"
Feb 24 00:11:00 crc kubenswrapper[4824]: I0224 00:11:00.483364 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7646d7595b-xmkcx" event={"ID":"4fb39704-7b42-4cdb-8b97-a410aee2e71d","Type":"ContainerStarted","Data":"a2952b5305c41a1d3a3a66df07e041f46c5d7f0968959a3cd17d893e4f11da6f"}
Feb 24 00:11:00 crc kubenswrapper[4824]: I0224 00:11:00.483415 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7646d7595b-xmkcx" event={"ID":"4fb39704-7b42-4cdb-8b97-a410aee2e71d","Type":"ContainerStarted","Data":"882c1288239b83f0062521860a302214b25cf57db5ec2c3d59d4f89576f3241c"}
Feb 24 00:11:00 crc kubenswrapper[4824]: I0224 00:11:00.483543 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7646d7595b-xmkcx"
Feb 24 00:11:00 crc kubenswrapper[4824]: I0224 00:11:00.490059 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7646d7595b-xmkcx"
Feb 24 00:11:00 crc kubenswrapper[4824]: I0224 00:11:00.500484 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7646d7595b-xmkcx" podStartSLOduration=3.500462959 podStartE2EDuration="3.500462959s" podCreationTimestamp="2026-02-24 00:10:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:11:00.499916554 +0000 UTC m=+324.489541033" watchObservedRunningTime="2026-02-24 00:11:00.500462959 +0000 UTC m=+324.490087428"
Feb 24 00:11:00 crc kubenswrapper[4824]: I0224 00:11:00.523935 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6kgrd"]
Feb 24 00:11:00 crc kubenswrapper[4824]: I0224 00:11:00.524182 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6kgrd" podUID="b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4" containerName="registry-server" containerID="cri-o://1d7b69ec3003cc3743308fff410ebd338aa2b0cc906c10d99169f9adff2cb524" gracePeriod=2
Feb 24 00:11:00 crc kubenswrapper[4824]: I0224 00:11:00.533849 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nzqwf"
Feb 24 00:11:00 crc kubenswrapper[4824]: I0224 00:11:00.533981 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mfzkw"
Feb 24 00:11:00 crc kubenswrapper[4824]: I0224 00:11:00.705021 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c800062-d998-4df3-97e1-ca5df1a57de9" path="/var/lib/kubelet/pods/1c800062-d998-4df3-97e1-ca5df1a57de9/volumes"
Feb 24 00:11:00 crc kubenswrapper[4824]: I0224 00:11:00.705600 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48b7664a-e47f-4e07-b650-093a751f389f" path="/var/lib/kubelet/pods/48b7664a-e47f-4e07-b650-093a751f389f/volumes"
Feb 24 00:11:00 crc kubenswrapper[4824]: I0224 00:11:00.722274 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bfhcg"]
Feb 24 00:11:00 crc kubenswrapper[4824]: I0224 00:11:00.722574 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bfhcg" podUID="b00860ed-9085-40bb-9041-16eac6d88fb1" containerName="registry-server" containerID="cri-o://740ff9f30206ac3dc9b4d8ae780788f1992408f2e27b01ff6a0bd56b96684f80" gracePeriod=2
Feb 24 00:11:00 crc kubenswrapper[4824]: I0224 00:11:00.802846 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zxplg"
Feb 24 00:11:00 crc kubenswrapper[4824]: I0224 00:11:00.802900 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zxplg"
Feb 24 00:11:00 crc kubenswrapper[4824]: I0224 00:11:00.844020 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zxplg"
Feb 24 00:11:00 crc kubenswrapper[4824]: I0224 00:11:00.930923 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6kgrd"
Feb 24 00:11:00 crc kubenswrapper[4824]: I0224 00:11:00.995054 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-748b548fc5-vnmjp"]
Feb 24 00:11:00 crc kubenswrapper[4824]: E0224 00:11:00.995339 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48b7664a-e47f-4e07-b650-093a751f389f" containerName="controller-manager"
Feb 24 00:11:00 crc kubenswrapper[4824]: I0224 00:11:00.995359 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="48b7664a-e47f-4e07-b650-093a751f389f" containerName="controller-manager"
Feb 24 00:11:00 crc kubenswrapper[4824]: E0224 00:11:00.995378 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4" containerName="registry-server"
Feb 24 00:11:00 crc kubenswrapper[4824]: I0224 00:11:00.995387 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4" containerName="registry-server"
Feb 24 00:11:00 crc kubenswrapper[4824]: E0224 00:11:00.995404 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4" containerName="extract-content"
Feb 24 00:11:00 crc kubenswrapper[4824]: I0224 00:11:00.995413 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4" containerName="extract-content"
Feb 24 00:11:00 crc kubenswrapper[4824]: E0224 00:11:00.995429 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4" containerName="extract-utilities"
Feb 24 00:11:00 crc kubenswrapper[4824]: I0224 00:11:00.995437 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4" containerName="extract-utilities"
Feb 24 00:11:00 crc kubenswrapper[4824]: I0224 00:11:00.995983 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="48b7664a-e47f-4e07-b650-093a751f389f" containerName="controller-manager"
Feb 24 00:11:00 crc kubenswrapper[4824]: I0224 00:11:00.996010 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4" containerName="registry-server"
Feb 24 00:11:00 crc kubenswrapper[4824]: I0224 00:11:00.996578 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-748b548fc5-vnmjp"
Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:00.998595 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:00.998800 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.001805 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.002763 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.007331 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-748b548fc5-vnmjp"]
Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.007670 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.020321 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.025478 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.045190 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4-catalog-content\") pod \"b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4\" (UID: \"b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4\") "
Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.045239 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c94kl\" (UniqueName: \"kubernetes.io/projected/b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4-kube-api-access-c94kl\") pod \"b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4\" (UID: \"b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4\") "
Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.045279 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4-utilities\") pod \"b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4\" (UID: \"b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4\") "
Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.046189 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4-utilities" (OuterVolumeSpecName: "utilities") pod "b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4" (UID: "b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.055888 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4-kube-api-access-c94kl" (OuterVolumeSpecName: "kube-api-access-c94kl") pod "b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4" (UID: "b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4"). InnerVolumeSpecName "kube-api-access-c94kl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.101380 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4" (UID: "b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.132084 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bfhcg"
Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.146929 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d4767901-4638-4ac8-9c1e-7e61341ddc21-proxy-ca-bundles\") pod \"controller-manager-748b548fc5-vnmjp\" (UID: \"d4767901-4638-4ac8-9c1e-7e61341ddc21\") " pod="openshift-controller-manager/controller-manager-748b548fc5-vnmjp"
Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.146993 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mb8jx\" (UniqueName: \"kubernetes.io/projected/d4767901-4638-4ac8-9c1e-7e61341ddc21-kube-api-access-mb8jx\") pod \"controller-manager-748b548fc5-vnmjp\" (UID: \"d4767901-4638-4ac8-9c1e-7e61341ddc21\") " pod="openshift-controller-manager/controller-manager-748b548fc5-vnmjp"
Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.147189 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4767901-4638-4ac8-9c1e-7e61341ddc21-config\") pod \"controller-manager-748b548fc5-vnmjp\" (UID: \"d4767901-4638-4ac8-9c1e-7e61341ddc21\") " pod="openshift-controller-manager/controller-manager-748b548fc5-vnmjp"
Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.147311 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4767901-4638-4ac8-9c1e-7e61341ddc21-client-ca\") pod \"controller-manager-748b548fc5-vnmjp\" (UID: \"d4767901-4638-4ac8-9c1e-7e61341ddc21\") " pod="openshift-controller-manager/controller-manager-748b548fc5-vnmjp"
Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.147400 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4767901-4638-4ac8-9c1e-7e61341ddc21-serving-cert\") pod \"controller-manager-748b548fc5-vnmjp\" (UID: \"d4767901-4638-4ac8-9c1e-7e61341ddc21\") " pod="openshift-controller-manager/controller-manager-748b548fc5-vnmjp"
Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.147560 4824 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.148059 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c94kl\" (UniqueName: \"kubernetes.io/projected/b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4-kube-api-access-c94kl\") on node \"crc\" DevicePath \"\""
Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.148076 4824 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4-utilities\") on node \"crc\" DevicePath \"\""
Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.249731 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t895b\" (UniqueName: \"kubernetes.io/projected/b00860ed-9085-40bb-9041-16eac6d88fb1-kube-api-access-t895b\") pod \"b00860ed-9085-40bb-9041-16eac6d88fb1\" (UID: \"b00860ed-9085-40bb-9041-16eac6d88fb1\") "
Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.250409 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b00860ed-9085-40bb-9041-16eac6d88fb1-catalog-content\") pod \"b00860ed-9085-40bb-9041-16eac6d88fb1\" (UID: \"b00860ed-9085-40bb-9041-16eac6d88fb1\") "
Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.250803 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b00860ed-9085-40bb-9041-16eac6d88fb1-utilities\") pod \"b00860ed-9085-40bb-9041-16eac6d88fb1\" (UID: \"b00860ed-9085-40bb-9041-16eac6d88fb1\") "
Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.251303 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d4767901-4638-4ac8-9c1e-7e61341ddc21-proxy-ca-bundles\") pod \"controller-manager-748b548fc5-vnmjp\" (UID: \"d4767901-4638-4ac8-9c1e-7e61341ddc21\") " pod="openshift-controller-manager/controller-manager-748b548fc5-vnmjp"
Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.251424 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mb8jx\" (UniqueName: \"kubernetes.io/projected/d4767901-4638-4ac8-9c1e-7e61341ddc21-kube-api-access-mb8jx\") pod \"controller-manager-748b548fc5-vnmjp\" (UID: \"d4767901-4638-4ac8-9c1e-7e61341ddc21\") " pod="openshift-controller-manager/controller-manager-748b548fc5-vnmjp"
Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.251570 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4767901-4638-4ac8-9c1e-7e61341ddc21-config\") pod \"controller-manager-748b548fc5-vnmjp\" (UID: \"d4767901-4638-4ac8-9c1e-7e61341ddc21\") " pod="openshift-controller-manager/controller-manager-748b548fc5-vnmjp"
Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.251726 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4767901-4638-4ac8-9c1e-7e61341ddc21-client-ca\") pod \"controller-manager-748b548fc5-vnmjp\" (UID: \"d4767901-4638-4ac8-9c1e-7e61341ddc21\") " pod="openshift-controller-manager/controller-manager-748b548fc5-vnmjp"
Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.251844 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4767901-4638-4ac8-9c1e-7e61341ddc21-serving-cert\") pod \"controller-manager-748b548fc5-vnmjp\" (UID: \"d4767901-4638-4ac8-9c1e-7e61341ddc21\") " pod="openshift-controller-manager/controller-manager-748b548fc5-vnmjp"
Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.251862 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b00860ed-9085-40bb-9041-16eac6d88fb1-utilities" (OuterVolumeSpecName: "utilities") pod "b00860ed-9085-40bb-9041-16eac6d88fb1" (UID: "b00860ed-9085-40bb-9041-16eac6d88fb1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.253014 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4767901-4638-4ac8-9c1e-7e61341ddc21-client-ca\") pod \"controller-manager-748b548fc5-vnmjp\" (UID: \"d4767901-4638-4ac8-9c1e-7e61341ddc21\") " pod="openshift-controller-manager/controller-manager-748b548fc5-vnmjp"
Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.253102 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d4767901-4638-4ac8-9c1e-7e61341ddc21-proxy-ca-bundles\") pod \"controller-manager-748b548fc5-vnmjp\" (UID: \"d4767901-4638-4ac8-9c1e-7e61341ddc21\") " pod="openshift-controller-manager/controller-manager-748b548fc5-vnmjp"
Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.253734 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4767901-4638-4ac8-9c1e-7e61341ddc21-config\") pod \"controller-manager-748b548fc5-vnmjp\" (UID: \"d4767901-4638-4ac8-9c1e-7e61341ddc21\") " pod="openshift-controller-manager/controller-manager-748b548fc5-vnmjp"
Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.257830 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b00860ed-9085-40bb-9041-16eac6d88fb1-kube-api-access-t895b" (OuterVolumeSpecName: "kube-api-access-t895b") pod "b00860ed-9085-40bb-9041-16eac6d88fb1" (UID: "b00860ed-9085-40bb-9041-16eac6d88fb1"). InnerVolumeSpecName "kube-api-access-t895b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.258628 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4767901-4638-4ac8-9c1e-7e61341ddc21-serving-cert\") pod \"controller-manager-748b548fc5-vnmjp\" (UID: \"d4767901-4638-4ac8-9c1e-7e61341ddc21\") " pod="openshift-controller-manager/controller-manager-748b548fc5-vnmjp"
Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.270703 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mb8jx\" (UniqueName: \"kubernetes.io/projected/d4767901-4638-4ac8-9c1e-7e61341ddc21-kube-api-access-mb8jx\") pod \"controller-manager-748b548fc5-vnmjp\" (UID: \"d4767901-4638-4ac8-9c1e-7e61341ddc21\") " pod="openshift-controller-manager/controller-manager-748b548fc5-vnmjp"
Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.307463 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b00860ed-9085-40bb-9041-16eac6d88fb1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b00860ed-9085-40bb-9041-16eac6d88fb1" (UID: "b00860ed-9085-40bb-9041-16eac6d88fb1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.345687 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-748b548fc5-vnmjp"
Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.352894 4824 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b00860ed-9085-40bb-9041-16eac6d88fb1-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.352952 4824 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b00860ed-9085-40bb-9041-16eac6d88fb1-utilities\") on node \"crc\" DevicePath \"\""
Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.352962 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t895b\" (UniqueName: \"kubernetes.io/projected/b00860ed-9085-40bb-9041-16eac6d88fb1-kube-api-access-t895b\") on node \"crc\" DevicePath \"\""
Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.415372 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gl27t"
Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.415436 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gl27t"
Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.504584 4824 generic.go:334] "Generic (PLEG): container finished" podID="b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4" containerID="1d7b69ec3003cc3743308fff410ebd338aa2b0cc906c10d99169f9adff2cb524" exitCode=0
Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.504678 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6kgrd" event={"ID":"b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4","Type":"ContainerDied","Data":"1d7b69ec3003cc3743308fff410ebd338aa2b0cc906c10d99169f9adff2cb524"}
Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.505040 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openshift-marketplace/certified-operators-6kgrd" event={"ID":"b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4","Type":"ContainerDied","Data":"a33e62fe6f2549eb1208d3cf356835348bf6325f74507046a34d8f566aaa9f3c"} Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.505067 4824 scope.go:117] "RemoveContainer" containerID="1d7b69ec3003cc3743308fff410ebd338aa2b0cc906c10d99169f9adff2cb524" Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.504710 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6kgrd" Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.511373 4824 generic.go:334] "Generic (PLEG): container finished" podID="b00860ed-9085-40bb-9041-16eac6d88fb1" containerID="740ff9f30206ac3dc9b4d8ae780788f1992408f2e27b01ff6a0bd56b96684f80" exitCode=0 Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.511485 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bfhcg" Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.511608 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bfhcg" event={"ID":"b00860ed-9085-40bb-9041-16eac6d88fb1","Type":"ContainerDied","Data":"740ff9f30206ac3dc9b4d8ae780788f1992408f2e27b01ff6a0bd56b96684f80"} Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.511651 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bfhcg" event={"ID":"b00860ed-9085-40bb-9041-16eac6d88fb1","Type":"ContainerDied","Data":"9c018af5a403cd93073513dceb16d5cd69816c1726faff3f2cab188f2753d464"} Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.519675 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gl27t" Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.538647 4824 scope.go:117] "RemoveContainer" 
containerID="25d34726cfc81bf777c29be29729340267c9cddf9a761fe641537d6b9d860999" Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.579479 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bfhcg"] Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.582492 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gl27t" Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.582620 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zxplg" Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.592787 4824 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bfhcg"] Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.599783 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6kgrd"] Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.601652 4824 scope.go:117] "RemoveContainer" containerID="8bc4c4e74760164a686d72949257ade00d0c3cfa06f89e62b82604ab57ead475" Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.603917 4824 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6kgrd"] Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.629388 4824 scope.go:117] "RemoveContainer" containerID="1d7b69ec3003cc3743308fff410ebd338aa2b0cc906c10d99169f9adff2cb524" Feb 24 00:11:01 crc kubenswrapper[4824]: E0224 00:11:01.630153 4824 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d7b69ec3003cc3743308fff410ebd338aa2b0cc906c10d99169f9adff2cb524\": container with ID starting with 1d7b69ec3003cc3743308fff410ebd338aa2b0cc906c10d99169f9adff2cb524 not found: ID does not exist" containerID="1d7b69ec3003cc3743308fff410ebd338aa2b0cc906c10d99169f9adff2cb524" Feb 24 00:11:01 crc 
kubenswrapper[4824]: I0224 00:11:01.630187 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d7b69ec3003cc3743308fff410ebd338aa2b0cc906c10d99169f9adff2cb524"} err="failed to get container status \"1d7b69ec3003cc3743308fff410ebd338aa2b0cc906c10d99169f9adff2cb524\": rpc error: code = NotFound desc = could not find container \"1d7b69ec3003cc3743308fff410ebd338aa2b0cc906c10d99169f9adff2cb524\": container with ID starting with 1d7b69ec3003cc3743308fff410ebd338aa2b0cc906c10d99169f9adff2cb524 not found: ID does not exist" Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.630212 4824 scope.go:117] "RemoveContainer" containerID="25d34726cfc81bf777c29be29729340267c9cddf9a761fe641537d6b9d860999" Feb 24 00:11:01 crc kubenswrapper[4824]: E0224 00:11:01.630612 4824 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25d34726cfc81bf777c29be29729340267c9cddf9a761fe641537d6b9d860999\": container with ID starting with 25d34726cfc81bf777c29be29729340267c9cddf9a761fe641537d6b9d860999 not found: ID does not exist" containerID="25d34726cfc81bf777c29be29729340267c9cddf9a761fe641537d6b9d860999" Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.630665 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25d34726cfc81bf777c29be29729340267c9cddf9a761fe641537d6b9d860999"} err="failed to get container status \"25d34726cfc81bf777c29be29729340267c9cddf9a761fe641537d6b9d860999\": rpc error: code = NotFound desc = could not find container \"25d34726cfc81bf777c29be29729340267c9cddf9a761fe641537d6b9d860999\": container with ID starting with 25d34726cfc81bf777c29be29729340267c9cddf9a761fe641537d6b9d860999 not found: ID does not exist" Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.630699 4824 scope.go:117] "RemoveContainer" containerID="8bc4c4e74760164a686d72949257ade00d0c3cfa06f89e62b82604ab57ead475" Feb 24 
00:11:01 crc kubenswrapper[4824]: E0224 00:11:01.631113 4824 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bc4c4e74760164a686d72949257ade00d0c3cfa06f89e62b82604ab57ead475\": container with ID starting with 8bc4c4e74760164a686d72949257ade00d0c3cfa06f89e62b82604ab57ead475 not found: ID does not exist" containerID="8bc4c4e74760164a686d72949257ade00d0c3cfa06f89e62b82604ab57ead475" Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.631141 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bc4c4e74760164a686d72949257ade00d0c3cfa06f89e62b82604ab57ead475"} err="failed to get container status \"8bc4c4e74760164a686d72949257ade00d0c3cfa06f89e62b82604ab57ead475\": rpc error: code = NotFound desc = could not find container \"8bc4c4e74760164a686d72949257ade00d0c3cfa06f89e62b82604ab57ead475\": container with ID starting with 8bc4c4e74760164a686d72949257ade00d0c3cfa06f89e62b82604ab57ead475 not found: ID does not exist" Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.631157 4824 scope.go:117] "RemoveContainer" containerID="740ff9f30206ac3dc9b4d8ae780788f1992408f2e27b01ff6a0bd56b96684f80" Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.652339 4824 scope.go:117] "RemoveContainer" containerID="3ea11f300330e655d35201dc1283dc47672fa1f01b0818a96fdef96fb4638961" Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.692314 4824 scope.go:117] "RemoveContainer" containerID="b1c2dccdb4d30a9f6f73707633f7236dc1482d098520afb3ff0820f1bf1efcba" Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.708677 4824 scope.go:117] "RemoveContainer" containerID="740ff9f30206ac3dc9b4d8ae780788f1992408f2e27b01ff6a0bd56b96684f80" Feb 24 00:11:01 crc kubenswrapper[4824]: E0224 00:11:01.709317 4824 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"740ff9f30206ac3dc9b4d8ae780788f1992408f2e27b01ff6a0bd56b96684f80\": container with ID starting with 740ff9f30206ac3dc9b4d8ae780788f1992408f2e27b01ff6a0bd56b96684f80 not found: ID does not exist" containerID="740ff9f30206ac3dc9b4d8ae780788f1992408f2e27b01ff6a0bd56b96684f80" Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.709361 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"740ff9f30206ac3dc9b4d8ae780788f1992408f2e27b01ff6a0bd56b96684f80"} err="failed to get container status \"740ff9f30206ac3dc9b4d8ae780788f1992408f2e27b01ff6a0bd56b96684f80\": rpc error: code = NotFound desc = could not find container \"740ff9f30206ac3dc9b4d8ae780788f1992408f2e27b01ff6a0bd56b96684f80\": container with ID starting with 740ff9f30206ac3dc9b4d8ae780788f1992408f2e27b01ff6a0bd56b96684f80 not found: ID does not exist" Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.709393 4824 scope.go:117] "RemoveContainer" containerID="3ea11f300330e655d35201dc1283dc47672fa1f01b0818a96fdef96fb4638961" Feb 24 00:11:01 crc kubenswrapper[4824]: E0224 00:11:01.709705 4824 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ea11f300330e655d35201dc1283dc47672fa1f01b0818a96fdef96fb4638961\": container with ID starting with 3ea11f300330e655d35201dc1283dc47672fa1f01b0818a96fdef96fb4638961 not found: ID does not exist" containerID="3ea11f300330e655d35201dc1283dc47672fa1f01b0818a96fdef96fb4638961" Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.709729 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ea11f300330e655d35201dc1283dc47672fa1f01b0818a96fdef96fb4638961"} err="failed to get container status \"3ea11f300330e655d35201dc1283dc47672fa1f01b0818a96fdef96fb4638961\": rpc error: code = NotFound desc = could not find container \"3ea11f300330e655d35201dc1283dc47672fa1f01b0818a96fdef96fb4638961\": container with ID 
starting with 3ea11f300330e655d35201dc1283dc47672fa1f01b0818a96fdef96fb4638961 not found: ID does not exist" Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.709750 4824 scope.go:117] "RemoveContainer" containerID="b1c2dccdb4d30a9f6f73707633f7236dc1482d098520afb3ff0820f1bf1efcba" Feb 24 00:11:01 crc kubenswrapper[4824]: E0224 00:11:01.710270 4824 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1c2dccdb4d30a9f6f73707633f7236dc1482d098520afb3ff0820f1bf1efcba\": container with ID starting with b1c2dccdb4d30a9f6f73707633f7236dc1482d098520afb3ff0820f1bf1efcba not found: ID does not exist" containerID="b1c2dccdb4d30a9f6f73707633f7236dc1482d098520afb3ff0820f1bf1efcba" Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.710302 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1c2dccdb4d30a9f6f73707633f7236dc1482d098520afb3ff0820f1bf1efcba"} err="failed to get container status \"b1c2dccdb4d30a9f6f73707633f7236dc1482d098520afb3ff0820f1bf1efcba\": rpc error: code = NotFound desc = could not find container \"b1c2dccdb4d30a9f6f73707633f7236dc1482d098520afb3ff0820f1bf1efcba\": container with ID starting with b1c2dccdb4d30a9f6f73707633f7236dc1482d098520afb3ff0820f1bf1efcba not found: ID does not exist" Feb 24 00:11:01 crc kubenswrapper[4824]: I0224 00:11:01.831037 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-748b548fc5-vnmjp"] Feb 24 00:11:01 crc kubenswrapper[4824]: W0224 00:11:01.839109 4824 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4767901_4638_4ac8_9c1e_7e61341ddc21.slice/crio-6faf23bbaf504552323f49afcdd6447816bf7709fe5e77f695a1faa41bfc90d0 WatchSource:0}: Error finding container 6faf23bbaf504552323f49afcdd6447816bf7709fe5e77f695a1faa41bfc90d0: Status 404 returned error can't find the 
container with id 6faf23bbaf504552323f49afcdd6447816bf7709fe5e77f695a1faa41bfc90d0 Feb 24 00:11:02 crc kubenswrapper[4824]: I0224 00:11:02.518203 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-748b548fc5-vnmjp" event={"ID":"d4767901-4638-4ac8-9c1e-7e61341ddc21","Type":"ContainerStarted","Data":"d5368730b964ed48604f2db0395f014d1318cc9a4881f28e1ada0f47b8ce9393"} Feb 24 00:11:02 crc kubenswrapper[4824]: I0224 00:11:02.518252 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-748b548fc5-vnmjp" event={"ID":"d4767901-4638-4ac8-9c1e-7e61341ddc21","Type":"ContainerStarted","Data":"6faf23bbaf504552323f49afcdd6447816bf7709fe5e77f695a1faa41bfc90d0"} Feb 24 00:11:02 crc kubenswrapper[4824]: I0224 00:11:02.518474 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-748b548fc5-vnmjp" Feb 24 00:11:02 crc kubenswrapper[4824]: I0224 00:11:02.524615 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-748b548fc5-vnmjp" Feb 24 00:11:02 crc kubenswrapper[4824]: I0224 00:11:02.536073 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-748b548fc5-vnmjp" podStartSLOduration=5.536049602 podStartE2EDuration="5.536049602s" podCreationTimestamp="2026-02-24 00:10:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:11:02.535082935 +0000 UTC m=+326.524707404" watchObservedRunningTime="2026-02-24 00:11:02.536049602 +0000 UTC m=+326.525674081" Feb 24 00:11:02 crc kubenswrapper[4824]: I0224 00:11:02.706034 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b00860ed-9085-40bb-9041-16eac6d88fb1" 
path="/var/lib/kubelet/pods/b00860ed-9085-40bb-9041-16eac6d88fb1/volumes" Feb 24 00:11:02 crc kubenswrapper[4824]: I0224 00:11:02.707663 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4" path="/var/lib/kubelet/pods/b8af18e1-1420-44fb-b1dc-0d0fb8f6aaf4/volumes" Feb 24 00:11:02 crc kubenswrapper[4824]: I0224 00:11:02.920975 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mfzkw"] Feb 24 00:11:02 crc kubenswrapper[4824]: I0224 00:11:02.921245 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mfzkw" podUID="08de7fe0-2d54-408b-8e09-3e1b9bcf931a" containerName="registry-server" containerID="cri-o://508ed9b0876f895bd93f2220b145fd5e2693472af97b22cae0f711fdc293c35c" gracePeriod=2 Feb 24 00:11:03 crc kubenswrapper[4824]: I0224 00:11:03.314335 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mfzkw" Feb 24 00:11:03 crc kubenswrapper[4824]: I0224 00:11:03.386135 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08de7fe0-2d54-408b-8e09-3e1b9bcf931a-catalog-content\") pod \"08de7fe0-2d54-408b-8e09-3e1b9bcf931a\" (UID: \"08de7fe0-2d54-408b-8e09-3e1b9bcf931a\") " Feb 24 00:11:03 crc kubenswrapper[4824]: I0224 00:11:03.386222 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8k4qw\" (UniqueName: \"kubernetes.io/projected/08de7fe0-2d54-408b-8e09-3e1b9bcf931a-kube-api-access-8k4qw\") pod \"08de7fe0-2d54-408b-8e09-3e1b9bcf931a\" (UID: \"08de7fe0-2d54-408b-8e09-3e1b9bcf931a\") " Feb 24 00:11:03 crc kubenswrapper[4824]: I0224 00:11:03.386274 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/08de7fe0-2d54-408b-8e09-3e1b9bcf931a-utilities\") pod \"08de7fe0-2d54-408b-8e09-3e1b9bcf931a\" (UID: \"08de7fe0-2d54-408b-8e09-3e1b9bcf931a\") " Feb 24 00:11:03 crc kubenswrapper[4824]: I0224 00:11:03.387475 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08de7fe0-2d54-408b-8e09-3e1b9bcf931a-utilities" (OuterVolumeSpecName: "utilities") pod "08de7fe0-2d54-408b-8e09-3e1b9bcf931a" (UID: "08de7fe0-2d54-408b-8e09-3e1b9bcf931a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:11:03 crc kubenswrapper[4824]: I0224 00:11:03.399777 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08de7fe0-2d54-408b-8e09-3e1b9bcf931a-kube-api-access-8k4qw" (OuterVolumeSpecName: "kube-api-access-8k4qw") pod "08de7fe0-2d54-408b-8e09-3e1b9bcf931a" (UID: "08de7fe0-2d54-408b-8e09-3e1b9bcf931a"). InnerVolumeSpecName "kube-api-access-8k4qw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:11:03 crc kubenswrapper[4824]: I0224 00:11:03.418191 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08de7fe0-2d54-408b-8e09-3e1b9bcf931a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "08de7fe0-2d54-408b-8e09-3e1b9bcf931a" (UID: "08de7fe0-2d54-408b-8e09-3e1b9bcf931a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:11:03 crc kubenswrapper[4824]: I0224 00:11:03.487539 4824 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08de7fe0-2d54-408b-8e09-3e1b9bcf931a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 00:11:03 crc kubenswrapper[4824]: I0224 00:11:03.487574 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8k4qw\" (UniqueName: \"kubernetes.io/projected/08de7fe0-2d54-408b-8e09-3e1b9bcf931a-kube-api-access-8k4qw\") on node \"crc\" DevicePath \"\"" Feb 24 00:11:03 crc kubenswrapper[4824]: I0224 00:11:03.487585 4824 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08de7fe0-2d54-408b-8e09-3e1b9bcf931a-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 00:11:03 crc kubenswrapper[4824]: I0224 00:11:03.530569 4824 generic.go:334] "Generic (PLEG): container finished" podID="08de7fe0-2d54-408b-8e09-3e1b9bcf931a" containerID="508ed9b0876f895bd93f2220b145fd5e2693472af97b22cae0f711fdc293c35c" exitCode=0 Feb 24 00:11:03 crc kubenswrapper[4824]: I0224 00:11:03.530641 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mfzkw" event={"ID":"08de7fe0-2d54-408b-8e09-3e1b9bcf931a","Type":"ContainerDied","Data":"508ed9b0876f895bd93f2220b145fd5e2693472af97b22cae0f711fdc293c35c"} Feb 24 00:11:03 crc kubenswrapper[4824]: I0224 00:11:03.530669 4824 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mfzkw" Feb 24 00:11:03 crc kubenswrapper[4824]: I0224 00:11:03.530712 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mfzkw" event={"ID":"08de7fe0-2d54-408b-8e09-3e1b9bcf931a","Type":"ContainerDied","Data":"eef69387238650e8470f3abae2d3e6234452b8da7d8847435def291fdad9a1d8"} Feb 24 00:11:03 crc kubenswrapper[4824]: I0224 00:11:03.530735 4824 scope.go:117] "RemoveContainer" containerID="508ed9b0876f895bd93f2220b145fd5e2693472af97b22cae0f711fdc293c35c" Feb 24 00:11:03 crc kubenswrapper[4824]: I0224 00:11:03.556682 4824 scope.go:117] "RemoveContainer" containerID="c12d91234669c462b12713951c50eb0c0d50df4a3939e1596a445f3e78f2d6f0" Feb 24 00:11:03 crc kubenswrapper[4824]: I0224 00:11:03.563420 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mfzkw"] Feb 24 00:11:03 crc kubenswrapper[4824]: I0224 00:11:03.565456 4824 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mfzkw"] Feb 24 00:11:03 crc kubenswrapper[4824]: I0224 00:11:03.590534 4824 scope.go:117] "RemoveContainer" containerID="095e0801ed7074f62f4831cd0263857d8e7741a2b32dd7eae5398cdc3839e533" Feb 24 00:11:03 crc kubenswrapper[4824]: I0224 00:11:03.607974 4824 scope.go:117] "RemoveContainer" containerID="508ed9b0876f895bd93f2220b145fd5e2693472af97b22cae0f711fdc293c35c" Feb 24 00:11:03 crc kubenswrapper[4824]: E0224 00:11:03.608925 4824 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"508ed9b0876f895bd93f2220b145fd5e2693472af97b22cae0f711fdc293c35c\": container with ID starting with 508ed9b0876f895bd93f2220b145fd5e2693472af97b22cae0f711fdc293c35c not found: ID does not exist" containerID="508ed9b0876f895bd93f2220b145fd5e2693472af97b22cae0f711fdc293c35c" Feb 24 00:11:03 crc kubenswrapper[4824]: I0224 00:11:03.608996 4824 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"508ed9b0876f895bd93f2220b145fd5e2693472af97b22cae0f711fdc293c35c"} err="failed to get container status \"508ed9b0876f895bd93f2220b145fd5e2693472af97b22cae0f711fdc293c35c\": rpc error: code = NotFound desc = could not find container \"508ed9b0876f895bd93f2220b145fd5e2693472af97b22cae0f711fdc293c35c\": container with ID starting with 508ed9b0876f895bd93f2220b145fd5e2693472af97b22cae0f711fdc293c35c not found: ID does not exist" Feb 24 00:11:03 crc kubenswrapper[4824]: I0224 00:11:03.609030 4824 scope.go:117] "RemoveContainer" containerID="c12d91234669c462b12713951c50eb0c0d50df4a3939e1596a445f3e78f2d6f0" Feb 24 00:11:03 crc kubenswrapper[4824]: E0224 00:11:03.609596 4824 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c12d91234669c462b12713951c50eb0c0d50df4a3939e1596a445f3e78f2d6f0\": container with ID starting with c12d91234669c462b12713951c50eb0c0d50df4a3939e1596a445f3e78f2d6f0 not found: ID does not exist" containerID="c12d91234669c462b12713951c50eb0c0d50df4a3939e1596a445f3e78f2d6f0" Feb 24 00:11:03 crc kubenswrapper[4824]: I0224 00:11:03.609651 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c12d91234669c462b12713951c50eb0c0d50df4a3939e1596a445f3e78f2d6f0"} err="failed to get container status \"c12d91234669c462b12713951c50eb0c0d50df4a3939e1596a445f3e78f2d6f0\": rpc error: code = NotFound desc = could not find container \"c12d91234669c462b12713951c50eb0c0d50df4a3939e1596a445f3e78f2d6f0\": container with ID starting with c12d91234669c462b12713951c50eb0c0d50df4a3939e1596a445f3e78f2d6f0 not found: ID does not exist" Feb 24 00:11:03 crc kubenswrapper[4824]: I0224 00:11:03.609686 4824 scope.go:117] "RemoveContainer" containerID="095e0801ed7074f62f4831cd0263857d8e7741a2b32dd7eae5398cdc3839e533" Feb 24 00:11:03 crc kubenswrapper[4824]: E0224 
00:11:03.610040 4824 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"095e0801ed7074f62f4831cd0263857d8e7741a2b32dd7eae5398cdc3839e533\": container with ID starting with 095e0801ed7074f62f4831cd0263857d8e7741a2b32dd7eae5398cdc3839e533 not found: ID does not exist" containerID="095e0801ed7074f62f4831cd0263857d8e7741a2b32dd7eae5398cdc3839e533" Feb 24 00:11:03 crc kubenswrapper[4824]: I0224 00:11:03.610064 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"095e0801ed7074f62f4831cd0263857d8e7741a2b32dd7eae5398cdc3839e533"} err="failed to get container status \"095e0801ed7074f62f4831cd0263857d8e7741a2b32dd7eae5398cdc3839e533\": rpc error: code = NotFound desc = could not find container \"095e0801ed7074f62f4831cd0263857d8e7741a2b32dd7eae5398cdc3839e533\": container with ID starting with 095e0801ed7074f62f4831cd0263857d8e7741a2b32dd7eae5398cdc3839e533 not found: ID does not exist" Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.210664 4824 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 24 00:11:04 crc kubenswrapper[4824]: E0224 00:11:04.210985 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b00860ed-9085-40bb-9041-16eac6d88fb1" containerName="extract-utilities" Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.211007 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="b00860ed-9085-40bb-9041-16eac6d88fb1" containerName="extract-utilities" Feb 24 00:11:04 crc kubenswrapper[4824]: E0224 00:11:04.211018 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08de7fe0-2d54-408b-8e09-3e1b9bcf931a" containerName="registry-server" Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.211027 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="08de7fe0-2d54-408b-8e09-3e1b9bcf931a" containerName="registry-server" Feb 24 
00:11:04 crc kubenswrapper[4824]: E0224 00:11:04.211035 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08de7fe0-2d54-408b-8e09-3e1b9bcf931a" containerName="extract-content" Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.211042 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="08de7fe0-2d54-408b-8e09-3e1b9bcf931a" containerName="extract-content" Feb 24 00:11:04 crc kubenswrapper[4824]: E0224 00:11:04.211060 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b00860ed-9085-40bb-9041-16eac6d88fb1" containerName="extract-content" Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.211067 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="b00860ed-9085-40bb-9041-16eac6d88fb1" containerName="extract-content" Feb 24 00:11:04 crc kubenswrapper[4824]: E0224 00:11:04.211075 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b00860ed-9085-40bb-9041-16eac6d88fb1" containerName="registry-server" Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.211081 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="b00860ed-9085-40bb-9041-16eac6d88fb1" containerName="registry-server" Feb 24 00:11:04 crc kubenswrapper[4824]: E0224 00:11:04.211092 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08de7fe0-2d54-408b-8e09-3e1b9bcf931a" containerName="extract-utilities" Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.211100 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="08de7fe0-2d54-408b-8e09-3e1b9bcf931a" containerName="extract-utilities" Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.211229 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="08de7fe0-2d54-408b-8e09-3e1b9bcf931a" containerName="registry-server" Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.211247 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="b00860ed-9085-40bb-9041-16eac6d88fb1" containerName="registry-server" Feb 24 
00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.211729 4824 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.211889 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.212233 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://09c8ceeca4d71401bcd6ae7d050ac58f840a0cc91f699f91f799930c896be922" gracePeriod=15 Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.212286 4824 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.212315 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://81bb21556a01a403315d2c9623aa07ab39ac4a789433423a89eda1da3547b6d2" gracePeriod=15 Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.212328 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://db87a346b00854fb6769899a263eb5f7fad8cfa115365e2197c0e02f7285ff99" gracePeriod=15 Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.212429 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" 
containerID="cri-o://6dd9b54ae9c9e895db2191be45a25d352d8c0617350519ad21f6ed3f2af941fc" gracePeriod=15 Feb 24 00:11:04 crc kubenswrapper[4824]: E0224 00:11:04.212452 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.212464 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 24 00:11:04 crc kubenswrapper[4824]: E0224 00:11:04.212474 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.212481 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 24 00:11:04 crc kubenswrapper[4824]: E0224 00:11:04.212488 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.212494 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 24 00:11:04 crc kubenswrapper[4824]: E0224 00:11:04.212505 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.212528 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.212419 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://bf3a76ba41dba79001ebad9083600800b0f094accb5dc60c2a364aa001909feb" gracePeriod=15 Feb 24 00:11:04 crc kubenswrapper[4824]: E0224 00:11:04.212537 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.212645 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 24 00:11:04 crc kubenswrapper[4824]: E0224 00:11:04.212676 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.212682 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 24 00:11:04 crc kubenswrapper[4824]: E0224 00:11:04.212701 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.212708 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 24 00:11:04 crc kubenswrapper[4824]: E0224 00:11:04.212722 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.212727 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.212920 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 24 
00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.212928 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.212942 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.212953 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.212963 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.212973 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.212979 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 24 00:11:04 crc kubenswrapper[4824]: E0224 00:11:04.213074 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.213083 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 24 00:11:04 crc kubenswrapper[4824]: E0224 00:11:04.213092 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.213100 4824 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.213242 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.213534 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 24 00:11:04 crc kubenswrapper[4824]: E0224 00:11:04.251913 4824 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.151:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.300012 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.300096 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.300148 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: 
\"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.300289 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.300344 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.300669 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.300691 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.300765 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.402183 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.402921 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.403002 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.403055 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.403021 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" 
(UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.403085 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.403176 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.403219 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.402590 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.403274 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.403304 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.403467 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.403568 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.403638 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.403783 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.403834 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.541612 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.542683 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.543231 4824 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="bf3a76ba41dba79001ebad9083600800b0f094accb5dc60c2a364aa001909feb" exitCode=2 Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.552965 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 00:11:04 crc kubenswrapper[4824]: E0224 00:11:04.597597 4824 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.151:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1897064351af0018 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 00:11:04.596578328 +0000 UTC m=+328.586202797,LastTimestamp:2026-02-24 00:11:04.596578328 +0000 UTC m=+328.586202797,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 00:11:04 crc kubenswrapper[4824]: I0224 00:11:04.701091 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08de7fe0-2d54-408b-8e09-3e1b9bcf931a" path="/var/lib/kubelet/pods/08de7fe0-2d54-408b-8e09-3e1b9bcf931a/volumes" Feb 24 00:11:05 crc kubenswrapper[4824]: E0224 00:11:05.179328 4824 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" Feb 24 00:11:05 crc kubenswrapper[4824]: E0224 00:11:05.179833 4824 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" Feb 24 00:11:05 crc kubenswrapper[4824]: E0224 00:11:05.180187 4824 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" Feb 24 00:11:05 crc kubenswrapper[4824]: E0224 00:11:05.180509 4824 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" Feb 24 00:11:05 crc kubenswrapper[4824]: E0224 00:11:05.181018 4824 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" Feb 24 00:11:05 crc kubenswrapper[4824]: I0224 00:11:05.181256 4824 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Feb 24 00:11:05 crc kubenswrapper[4824]: E0224 00:11:05.181590 4824 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="200ms" Feb 24 00:11:05 crc kubenswrapper[4824]: E0224 00:11:05.382576 4824 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="400ms" Feb 24 00:11:05 crc kubenswrapper[4824]: I0224 00:11:05.552283 4824 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 24 00:11:05 crc kubenswrapper[4824]: I0224 00:11:05.554890 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 24 00:11:05 crc kubenswrapper[4824]: I0224 00:11:05.557834 4824 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="db87a346b00854fb6769899a263eb5f7fad8cfa115365e2197c0e02f7285ff99" exitCode=0 Feb 24 00:11:05 crc kubenswrapper[4824]: I0224 00:11:05.557883 4824 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="81bb21556a01a403315d2c9623aa07ab39ac4a789433423a89eda1da3547b6d2" exitCode=0 Feb 24 00:11:05 crc kubenswrapper[4824]: I0224 00:11:05.557900 4824 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6dd9b54ae9c9e895db2191be45a25d352d8c0617350519ad21f6ed3f2af941fc" exitCode=0 Feb 24 00:11:05 crc kubenswrapper[4824]: I0224 00:11:05.557948 4824 scope.go:117] "RemoveContainer" containerID="b537bda4e4e7c025e1c81ad716f22312ab00e988cca42f2bed056b96ed0a594a" Feb 24 00:11:05 crc kubenswrapper[4824]: I0224 00:11:05.560707 4824 generic.go:334] "Generic (PLEG): container finished" podID="466928f3-88e1-4111-8358-13db2bd5ba58" containerID="713672e1f8b7451df74461c3a60e57bbab1ffd950fc9f64d1a805ac3787f3127" exitCode=0 Feb 24 00:11:05 crc kubenswrapper[4824]: I0224 00:11:05.560828 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"466928f3-88e1-4111-8358-13db2bd5ba58","Type":"ContainerDied","Data":"713672e1f8b7451df74461c3a60e57bbab1ffd950fc9f64d1a805ac3787f3127"} Feb 24 00:11:05 crc kubenswrapper[4824]: I0224 00:11:05.561622 4824 status_manager.go:851] 
"Failed to get status for pod" podUID="466928f3-88e1-4111-8358-13db2bd5ba58" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Feb 24 00:11:05 crc kubenswrapper[4824]: I0224 00:11:05.564028 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"552a62587bcd4bb19b0331dbaea1c6531dfe36fcaa945b33259b7c1bdcace817"} Feb 24 00:11:05 crc kubenswrapper[4824]: E0224 00:11:05.784354 4824 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="800ms" Feb 24 00:11:06 crc kubenswrapper[4824]: I0224 00:11:06.572149 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"1491c4b381f60602c7a24a8e5a735a39ea625073883a8aa7fd3568c56da3a7e7"} Feb 24 00:11:06 crc kubenswrapper[4824]: E0224 00:11:06.585086 4824 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="1.6s" Feb 24 00:11:06 crc kubenswrapper[4824]: I0224 00:11:06.697877 4824 status_manager.go:851] "Failed to get status for pod" podUID="466928f3-88e1-4111-8358-13db2bd5ba58" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection 
refused" Feb 24 00:11:06 crc kubenswrapper[4824]: I0224 00:11:06.910103 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 24 00:11:06 crc kubenswrapper[4824]: I0224 00:11:06.910752 4824 status_manager.go:851] "Failed to get status for pod" podUID="466928f3-88e1-4111-8358-13db2bd5ba58" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Feb 24 00:11:07 crc kubenswrapper[4824]: I0224 00:11:07.042808 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/466928f3-88e1-4111-8358-13db2bd5ba58-kubelet-dir\") pod \"466928f3-88e1-4111-8358-13db2bd5ba58\" (UID: \"466928f3-88e1-4111-8358-13db2bd5ba58\") " Feb 24 00:11:07 crc kubenswrapper[4824]: I0224 00:11:07.042914 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/466928f3-88e1-4111-8358-13db2bd5ba58-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "466928f3-88e1-4111-8358-13db2bd5ba58" (UID: "466928f3-88e1-4111-8358-13db2bd5ba58"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 00:11:07 crc kubenswrapper[4824]: I0224 00:11:07.042974 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/466928f3-88e1-4111-8358-13db2bd5ba58-kube-api-access\") pod \"466928f3-88e1-4111-8358-13db2bd5ba58\" (UID: \"466928f3-88e1-4111-8358-13db2bd5ba58\") " Feb 24 00:11:07 crc kubenswrapper[4824]: I0224 00:11:07.043004 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/466928f3-88e1-4111-8358-13db2bd5ba58-var-lock\") pod \"466928f3-88e1-4111-8358-13db2bd5ba58\" (UID: \"466928f3-88e1-4111-8358-13db2bd5ba58\") " Feb 24 00:11:07 crc kubenswrapper[4824]: I0224 00:11:07.043193 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/466928f3-88e1-4111-8358-13db2bd5ba58-var-lock" (OuterVolumeSpecName: "var-lock") pod "466928f3-88e1-4111-8358-13db2bd5ba58" (UID: "466928f3-88e1-4111-8358-13db2bd5ba58"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 00:11:07 crc kubenswrapper[4824]: I0224 00:11:07.043364 4824 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/466928f3-88e1-4111-8358-13db2bd5ba58-var-lock\") on node \"crc\" DevicePath \"\"" Feb 24 00:11:07 crc kubenswrapper[4824]: I0224 00:11:07.043392 4824 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/466928f3-88e1-4111-8358-13db2bd5ba58-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 24 00:11:07 crc kubenswrapper[4824]: I0224 00:11:07.078959 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/466928f3-88e1-4111-8358-13db2bd5ba58-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "466928f3-88e1-4111-8358-13db2bd5ba58" (UID: "466928f3-88e1-4111-8358-13db2bd5ba58"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:11:07 crc kubenswrapper[4824]: I0224 00:11:07.144174 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/466928f3-88e1-4111-8358-13db2bd5ba58-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 24 00:11:07 crc kubenswrapper[4824]: I0224 00:11:07.590228 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 24 00:11:07 crc kubenswrapper[4824]: I0224 00:11:07.591023 4824 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="09c8ceeca4d71401bcd6ae7d050ac58f840a0cc91f699f91f799930c896be922" exitCode=0 Feb 24 00:11:07 crc kubenswrapper[4824]: I0224 00:11:07.593397 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" 
event={"ID":"466928f3-88e1-4111-8358-13db2bd5ba58","Type":"ContainerDied","Data":"3e14bb11973a03d1681cf1c9d6b14d165f03b40a1a2b66d541ff08ad0753d14f"} Feb 24 00:11:07 crc kubenswrapper[4824]: I0224 00:11:07.593460 4824 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e14bb11973a03d1681cf1c9d6b14d165f03b40a1a2b66d541ff08ad0753d14f" Feb 24 00:11:07 crc kubenswrapper[4824]: I0224 00:11:07.593858 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 24 00:11:07 crc kubenswrapper[4824]: I0224 00:11:07.593891 4824 status_manager.go:851] "Failed to get status for pod" podUID="466928f3-88e1-4111-8358-13db2bd5ba58" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Feb 24 00:11:07 crc kubenswrapper[4824]: E0224 00:11:07.594093 4824 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.151:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 00:11:07 crc kubenswrapper[4824]: I0224 00:11:07.606793 4824 status_manager.go:851] "Failed to get status for pod" podUID="466928f3-88e1-4111-8358-13db2bd5ba58" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Feb 24 00:11:07 crc kubenswrapper[4824]: I0224 00:11:07.776127 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 24 00:11:07 crc kubenswrapper[4824]: I0224 00:11:07.776944 4824 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 00:11:07 crc kubenswrapper[4824]: I0224 00:11:07.778169 4824 status_manager.go:851] "Failed to get status for pod" podUID="466928f3-88e1-4111-8358-13db2bd5ba58" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Feb 24 00:11:07 crc kubenswrapper[4824]: I0224 00:11:07.778717 4824 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Feb 24 00:11:07 crc kubenswrapper[4824]: I0224 00:11:07.853059 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 24 00:11:07 crc kubenswrapper[4824]: I0224 00:11:07.853177 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 00:11:07 crc kubenswrapper[4824]: I0224 00:11:07.853192 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 24 00:11:07 crc kubenswrapper[4824]: I0224 00:11:07.853245 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 00:11:07 crc kubenswrapper[4824]: I0224 00:11:07.853337 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 24 00:11:07 crc kubenswrapper[4824]: I0224 00:11:07.853449 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 00:11:07 crc kubenswrapper[4824]: I0224 00:11:07.853807 4824 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Feb 24 00:11:07 crc kubenswrapper[4824]: I0224 00:11:07.853829 4824 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 24 00:11:07 crc kubenswrapper[4824]: I0224 00:11:07.853841 4824 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 24 00:11:08 crc kubenswrapper[4824]: E0224 00:11:08.186610 4824 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="3.2s" Feb 24 00:11:08 crc kubenswrapper[4824]: E0224 00:11:08.512563 4824 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.151:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1897064351af0018 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 00:11:04.596578328 +0000 UTC m=+328.586202797,LastTimestamp:2026-02-24 00:11:04.596578328 +0000 UTC m=+328.586202797,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 00:11:08 crc kubenswrapper[4824]: I0224 00:11:08.606315 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 24 00:11:08 crc kubenswrapper[4824]: I0224 00:11:08.608000 4824 scope.go:117] "RemoveContainer" containerID="db87a346b00854fb6769899a263eb5f7fad8cfa115365e2197c0e02f7285ff99" Feb 24 00:11:08 crc kubenswrapper[4824]: I0224 00:11:08.608080 4824 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 00:11:08 crc kubenswrapper[4824]: I0224 00:11:08.622361 4824 status_manager.go:851] "Failed to get status for pod" podUID="466928f3-88e1-4111-8358-13db2bd5ba58" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Feb 24 00:11:08 crc kubenswrapper[4824]: I0224 00:11:08.623248 4824 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Feb 24 00:11:08 crc kubenswrapper[4824]: I0224 00:11:08.627304 4824 scope.go:117] "RemoveContainer" containerID="81bb21556a01a403315d2c9623aa07ab39ac4a789433423a89eda1da3547b6d2" Feb 24 00:11:08 crc kubenswrapper[4824]: I0224 00:11:08.642885 4824 scope.go:117] "RemoveContainer" containerID="6dd9b54ae9c9e895db2191be45a25d352d8c0617350519ad21f6ed3f2af941fc" Feb 24 00:11:08 crc kubenswrapper[4824]: I0224 00:11:08.655640 4824 scope.go:117] "RemoveContainer" containerID="bf3a76ba41dba79001ebad9083600800b0f094accb5dc60c2a364aa001909feb" Feb 24 00:11:08 crc kubenswrapper[4824]: I0224 00:11:08.672117 4824 scope.go:117] "RemoveContainer" containerID="09c8ceeca4d71401bcd6ae7d050ac58f840a0cc91f699f91f799930c896be922" Feb 24 00:11:08 crc kubenswrapper[4824]: I0224 00:11:08.686316 4824 scope.go:117] "RemoveContainer" containerID="5fec537d7330c01eaddfb379d343ead1a363fd5b2894edf23dcd5c6218a0302a" Feb 24 00:11:08 crc kubenswrapper[4824]: I0224 00:11:08.702759 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Feb 24 00:11:09 
crc kubenswrapper[4824]: E0224 00:11:09.729979 4824 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.151:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" volumeName="registry-storage" Feb 24 00:11:10 crc kubenswrapper[4824]: I0224 00:11:10.800749 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:11:10 crc kubenswrapper[4824]: I0224 00:11:10.800940 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:11:10 crc kubenswrapper[4824]: W0224 00:11:10.802942 4824 reflector.go:561] object-"openshift-network-diagnostics"/"kube-root-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27342": dial tcp 38.102.83.151:6443: connect: connection refused Feb 24 00:11:10 crc kubenswrapper[4824]: E0224 00:11:10.803667 4824 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list 
*v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27342\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Feb 24 00:11:10 crc kubenswrapper[4824]: I0224 00:11:10.804080 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:11:10 crc kubenswrapper[4824]: I0224 00:11:10.804918 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:11:10 crc kubenswrapper[4824]: W0224 00:11:10.804983 4824 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin-cert": failed to list *v1.Secret: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27341": dial tcp 38.102.83.151:6443: connect: connection refused Feb 24 00:11:10 crc kubenswrapper[4824]: E0224 00:11:10.805760 4824 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27341\": dial tcp 
38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Feb 24 00:11:10 crc kubenswrapper[4824]: W0224 00:11:10.805752 4824 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27342": dial tcp 38.102.83.151:6443: connect: connection refused Feb 24 00:11:10 crc kubenswrapper[4824]: E0224 00:11:10.805900 4824 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27342\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Feb 24 00:11:11 crc kubenswrapper[4824]: E0224 00:11:11.388057 4824 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="6.4s" Feb 24 00:11:11 crc kubenswrapper[4824]: E0224 00:11:11.801868 4824 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Feb 24 00:11:11 crc kubenswrapper[4824]: E0224 00:11:11.802240 4824 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Feb 24 00:11:11 crc kubenswrapper[4824]: W0224 00:11:11.802655 4824 reflector.go:561] object-"openshift-network-diagnostics"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: Get 
"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27342": dial tcp 38.102.83.151:6443: connect: connection refused Feb 24 00:11:11 crc kubenswrapper[4824]: E0224 00:11:11.802742 4824 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27342\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Feb 24 00:11:11 crc kubenswrapper[4824]: E0224 00:11:11.805192 4824 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: failed to sync secret cache: timed out waiting for the condition Feb 24 00:11:11 crc kubenswrapper[4824]: E0224 00:11:11.805224 4824 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: failed to sync configmap cache: timed out waiting for the condition Feb 24 00:11:11 crc kubenswrapper[4824]: E0224 00:11:11.805331 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 00:13:13.8052619 +0000 UTC m=+457.794886369 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : failed to sync secret cache: timed out waiting for the condition Feb 24 00:11:11 crc kubenswrapper[4824]: E0224 00:11:11.805362 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 00:13:13.805350132 +0000 UTC m=+457.794974601 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : failed to sync configmap cache: timed out waiting for the condition Feb 24 00:11:12 crc kubenswrapper[4824]: E0224 00:11:12.803146 4824 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Feb 24 00:11:12 crc kubenswrapper[4824]: E0224 00:11:12.803450 4824 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: failed to sync configmap cache: timed out waiting for the condition Feb 24 00:11:12 crc kubenswrapper[4824]: E0224 00:11:12.803191 4824 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Feb 24 00:11:12 crc kubenswrapper[4824]: E0224 00:11:12.803493 4824 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: failed to sync configmap cache: timed out waiting for the condition Feb 24 00:11:12 crc kubenswrapper[4824]: E0224 00:11:12.803537 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-24 00:13:14.803495712 +0000 UTC m=+458.793120181 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : failed to sync configmap cache: timed out waiting for the condition Feb 24 00:11:12 crc kubenswrapper[4824]: E0224 00:11:12.803576 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-24 00:13:14.803556804 +0000 UTC m=+458.793181293 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : failed to sync configmap cache: timed out waiting for the condition Feb 24 00:11:12 crc kubenswrapper[4824]: W0224 00:11:12.970927 4824 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin-cert": failed to list *v1.Secret: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27341": dial tcp 38.102.83.151:6443: connect: connection refused Feb 24 00:11:12 crc kubenswrapper[4824]: E0224 00:11:12.971714 4824 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27341\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Feb 24 00:11:13 crc kubenswrapper[4824]: W0224 00:11:13.230362 4824 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27342": dial tcp 38.102.83.151:6443: connect: connection refused Feb 24 00:11:13 crc kubenswrapper[4824]: E0224 00:11:13.230493 4824 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27342\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Feb 24 00:11:13 crc kubenswrapper[4824]: W0224 00:11:13.725456 4824 reflector.go:561] object-"openshift-network-diagnostics"/"kube-root-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27342": dial tcp 38.102.83.151:6443: connect: connection refused Feb 24 00:11:13 crc kubenswrapper[4824]: E0224 00:11:13.725611 4824 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27342\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Feb 24 00:11:14 crc kubenswrapper[4824]: W0224 00:11:14.530891 4824 reflector.go:561] object-"openshift-network-diagnostics"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27342": dial tcp 38.102.83.151:6443: connect: connection refused Feb 24 00:11:14 crc kubenswrapper[4824]: E0224 00:11:14.531038 4824 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27342\": dial tcp 
38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Feb 24 00:11:16 crc kubenswrapper[4824]: I0224 00:11:16.699309 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 00:11:16 crc kubenswrapper[4824]: I0224 00:11:16.700665 4824 status_manager.go:851] "Failed to get status for pod" podUID="466928f3-88e1-4111-8358-13db2bd5ba58" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Feb 24 00:11:16 crc kubenswrapper[4824]: I0224 00:11:16.701232 4824 status_manager.go:851] "Failed to get status for pod" podUID="466928f3-88e1-4111-8358-13db2bd5ba58" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Feb 24 00:11:16 crc kubenswrapper[4824]: I0224 00:11:16.722275 4824 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5209eb40-b077-42f2-9239-27bf1cde0e05" Feb 24 00:11:16 crc kubenswrapper[4824]: I0224 00:11:16.722692 4824 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5209eb40-b077-42f2-9239-27bf1cde0e05" Feb 24 00:11:16 crc kubenswrapper[4824]: E0224 00:11:16.723425 4824 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 00:11:16 crc kubenswrapper[4824]: I0224 00:11:16.724091 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 00:11:17 crc kubenswrapper[4824]: I0224 00:11:17.675029 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Feb 24 00:11:17 crc kubenswrapper[4824]: I0224 00:11:17.677182 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 24 00:11:17 crc kubenswrapper[4824]: I0224 00:11:17.677258 4824 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="45746da193a3aaa64e64067e2324afb4e4fb1a90a1357461ed97c0bbb5109f5b" exitCode=1 Feb 24 00:11:17 crc kubenswrapper[4824]: I0224 00:11:17.677354 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"45746da193a3aaa64e64067e2324afb4e4fb1a90a1357461ed97c0bbb5109f5b"} Feb 24 00:11:17 crc kubenswrapper[4824]: I0224 00:11:17.678141 4824 scope.go:117] "RemoveContainer" containerID="45746da193a3aaa64e64067e2324afb4e4fb1a90a1357461ed97c0bbb5109f5b" Feb 24 00:11:17 crc kubenswrapper[4824]: I0224 00:11:17.678470 4824 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Feb 24 00:11:17 crc kubenswrapper[4824]: I0224 00:11:17.679584 4824 status_manager.go:851] "Failed to get status for pod" podUID="466928f3-88e1-4111-8358-13db2bd5ba58" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Feb 24 00:11:17 crc kubenswrapper[4824]: I0224 00:11:17.680821 4824 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="88e1482fbfe26a9ddc7dd159c8b550ccebd195491516483ad9ac1b92c7444bc3" exitCode=0 Feb 24 00:11:17 crc kubenswrapper[4824]: I0224 00:11:17.680867 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"88e1482fbfe26a9ddc7dd159c8b550ccebd195491516483ad9ac1b92c7444bc3"} Feb 24 00:11:17 crc kubenswrapper[4824]: I0224 00:11:17.680942 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"9bffb7df3529fff53c3ecfb1a3689d2a402f10e28e581ffcfa2f90e68fcb4cf4"} Feb 24 00:11:17 crc kubenswrapper[4824]: I0224 00:11:17.681614 4824 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5209eb40-b077-42f2-9239-27bf1cde0e05" Feb 24 00:11:17 crc kubenswrapper[4824]: I0224 00:11:17.681658 4824 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5209eb40-b077-42f2-9239-27bf1cde0e05" Feb 24 00:11:17 crc kubenswrapper[4824]: I0224 00:11:17.682195 4824 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Feb 24 00:11:17 crc kubenswrapper[4824]: E0224 00:11:17.682330 4824 mirror_client.go:138] "Failed deleting a mirror pod" 
err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 00:11:17 crc kubenswrapper[4824]: I0224 00:11:17.682692 4824 status_manager.go:851] "Failed to get status for pod" podUID="466928f3-88e1-4111-8358-13db2bd5ba58" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Feb 24 00:11:17 crc kubenswrapper[4824]: E0224 00:11:17.789402 4824 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="7s" Feb 24 00:11:17 crc kubenswrapper[4824]: W0224 00:11:17.908757 4824 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27342": dial tcp 38.102.83.151:6443: connect: connection refused Feb 24 00:11:17 crc kubenswrapper[4824]: E0224 00:11:17.908849 4824 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27342\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Feb 24 00:11:18 crc kubenswrapper[4824]: I0224 00:11:18.033954 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 00:11:18 crc kubenswrapper[4824]: I0224 00:11:18.689733 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Feb 24 00:11:18 crc kubenswrapper[4824]: I0224 00:11:18.691808 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 24 00:11:18 crc kubenswrapper[4824]: I0224 00:11:18.691906 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"34eb8fb83da5aca983d6da868242cce539ecbefeda8efd3d70063bb191fa81ec"} Feb 24 00:11:18 crc kubenswrapper[4824]: I0224 00:11:18.703075 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2be20a298d84dcd9e3ae2da09fa4ae09872abedbc6a4cc9dd61ad0b7d4398737"} Feb 24 00:11:18 crc kubenswrapper[4824]: I0224 00:11:18.703131 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"787b3d96926e01383a9793acd1ef1b226f2a21553ca02562140611f205b2b7ec"} Feb 24 00:11:18 crc kubenswrapper[4824]: I0224 00:11:18.703151 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2edaa15be988babb6641b30c4b06a22b37ddfe81a7e261de5cabd621e2676659"} Feb 24 00:11:19 crc kubenswrapper[4824]: I0224 00:11:19.712750 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"47837f319ab77f6353d3859929f2dcc0baf720494b33907d4bfb682b813c7a3a"} Feb 24 00:11:19 crc kubenswrapper[4824]: I0224 00:11:19.712808 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e2a31b42788232608657e2d119d71b3e07fa9823ac32076ef7f5b0fcc0ea205b"} Feb 24 00:11:19 crc kubenswrapper[4824]: I0224 00:11:19.713061 4824 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5209eb40-b077-42f2-9239-27bf1cde0e05" Feb 24 00:11:19 crc kubenswrapper[4824]: I0224 00:11:19.713078 4824 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5209eb40-b077-42f2-9239-27bf1cde0e05" Feb 24 00:11:20 crc kubenswrapper[4824]: E0224 00:11:20.710183 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-s2dwl], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 00:11:21 crc kubenswrapper[4824]: E0224 00:11:21.717792 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert nginx-conf], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 00:11:21 crc kubenswrapper[4824]: I0224 00:11:21.724346 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 00:11:21 crc kubenswrapper[4824]: I0224 00:11:21.724390 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 00:11:21 crc kubenswrapper[4824]: I0224 00:11:21.730134 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 00:11:21 crc kubenswrapper[4824]: E0224 00:11:21.748821 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-cqllr], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 00:11:24 crc kubenswrapper[4824]: I0224 00:11:24.445822 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 24 00:11:24 crc kubenswrapper[4824]: I0224 00:11:24.703787 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 24 00:11:24 crc kubenswrapper[4824]: I0224 00:11:24.723144 4824 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 00:11:24 crc kubenswrapper[4824]: I0224 00:11:24.764131 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 24 00:11:25 crc kubenswrapper[4824]: I0224 00:11:25.744573 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 00:11:25 crc kubenswrapper[4824]: I0224 00:11:25.744695 4824 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5209eb40-b077-42f2-9239-27bf1cde0e05" Feb 24 00:11:25 crc kubenswrapper[4824]: I0224 00:11:25.744739 4824 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5209eb40-b077-42f2-9239-27bf1cde0e05" Feb 24 00:11:25 crc kubenswrapper[4824]: I0224 
00:11:25.751946 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 00:11:26 crc kubenswrapper[4824]: I0224 00:11:26.533705 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 00:11:26 crc kubenswrapper[4824]: I0224 00:11:26.708582 4824 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="8f81cf07-5ee1-44c0-bd9f-eda3410b1085" Feb 24 00:11:26 crc kubenswrapper[4824]: I0224 00:11:26.749714 4824 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5209eb40-b077-42f2-9239-27bf1cde0e05" Feb 24 00:11:26 crc kubenswrapper[4824]: I0224 00:11:26.749745 4824 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5209eb40-b077-42f2-9239-27bf1cde0e05" Feb 24 00:11:26 crc kubenswrapper[4824]: I0224 00:11:26.753128 4824 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="8f81cf07-5ee1-44c0-bd9f-eda3410b1085" Feb 24 00:11:26 crc kubenswrapper[4824]: I0224 00:11:26.978012 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 24 00:11:27 crc kubenswrapper[4824]: I0224 00:11:27.756458 4824 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5209eb40-b077-42f2-9239-27bf1cde0e05" Feb 24 00:11:27 crc kubenswrapper[4824]: I0224 00:11:27.756502 4824 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5209eb40-b077-42f2-9239-27bf1cde0e05" Feb 24 00:11:27 crc kubenswrapper[4824]: I0224 
00:11:27.760928 4824 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="8f81cf07-5ee1-44c0-bd9f-eda3410b1085" Feb 24 00:11:28 crc kubenswrapper[4824]: I0224 00:11:28.030021 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 00:11:28 crc kubenswrapper[4824]: I0224 00:11:28.030672 4824 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 24 00:11:28 crc kubenswrapper[4824]: I0224 00:11:28.030810 4824 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 24 00:11:32 crc kubenswrapper[4824]: I0224 00:11:32.693892 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 00:11:33 crc kubenswrapper[4824]: I0224 00:11:33.693188 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 00:11:34 crc kubenswrapper[4824]: I0224 00:11:34.841852 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 24 00:11:35 crc kubenswrapper[4824]: I0224 00:11:35.183024 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 24 00:11:35 crc kubenswrapper[4824]: I0224 00:11:35.397564 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 24 00:11:35 crc kubenswrapper[4824]: I0224 00:11:35.591198 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 24 00:11:35 crc kubenswrapper[4824]: I0224 00:11:35.617861 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 24 00:11:36 crc kubenswrapper[4824]: I0224 00:11:36.324886 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 24 00:11:36 crc kubenswrapper[4824]: I0224 00:11:36.355323 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 24 00:11:36 crc kubenswrapper[4824]: I0224 00:11:36.577705 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 24 00:11:36 crc kubenswrapper[4824]: I0224 00:11:36.693821 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 00:11:36 crc kubenswrapper[4824]: I0224 00:11:36.765576 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 24 00:11:36 crc kubenswrapper[4824]: I0224 00:11:36.849819 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 24 00:11:36 crc kubenswrapper[4824]: I0224 00:11:36.887997 4824 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 24 00:11:37 crc kubenswrapper[4824]: I0224 00:11:37.004464 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 24 00:11:37 crc kubenswrapper[4824]: I0224 00:11:37.197489 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 24 00:11:37 crc kubenswrapper[4824]: I0224 00:11:37.329526 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 24 00:11:37 crc kubenswrapper[4824]: I0224 00:11:37.367812 4824 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 24 00:11:37 crc kubenswrapper[4824]: I0224 00:11:37.693425 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 24 00:11:38 crc kubenswrapper[4824]: I0224 00:11:38.031246 4824 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 24 00:11:38 crc kubenswrapper[4824]: I0224 00:11:38.031352 4824 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 24 00:11:38 crc kubenswrapper[4824]: I0224 00:11:38.109187 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 24 00:11:38 crc kubenswrapper[4824]: I0224 00:11:38.118042 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 24 00:11:38 crc kubenswrapper[4824]: I0224 00:11:38.221305 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 24 00:11:38 crc kubenswrapper[4824]: I0224 00:11:38.224692 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 24 00:11:38 crc kubenswrapper[4824]: I0224 00:11:38.228853 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 24 00:11:38 crc kubenswrapper[4824]: I0224 00:11:38.232025 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 24 00:11:38 crc kubenswrapper[4824]: I0224 00:11:38.238202 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 24 00:11:38 crc kubenswrapper[4824]: I0224 00:11:38.354579 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 24 00:11:38 crc kubenswrapper[4824]: I0224 00:11:38.380478 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 
24 00:11:38 crc kubenswrapper[4824]: I0224 00:11:38.384372 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 24 00:11:38 crc kubenswrapper[4824]: I0224 00:11:38.442711 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 24 00:11:38 crc kubenswrapper[4824]: I0224 00:11:38.444317 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 24 00:11:38 crc kubenswrapper[4824]: I0224 00:11:38.543460 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 24 00:11:38 crc kubenswrapper[4824]: I0224 00:11:38.730461 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 24 00:11:38 crc kubenswrapper[4824]: I0224 00:11:38.860399 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 24 00:11:38 crc kubenswrapper[4824]: I0224 00:11:38.871736 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 24 00:11:38 crc kubenswrapper[4824]: I0224 00:11:38.899784 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 24 00:11:39 crc kubenswrapper[4824]: I0224 00:11:39.023592 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 24 00:11:39 crc kubenswrapper[4824]: I0224 00:11:39.028370 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 24 00:11:39 crc kubenswrapper[4824]: I0224 00:11:39.131235 4824 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 24 00:11:39 crc kubenswrapper[4824]: I0224 00:11:39.148623 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 24 00:11:39 crc kubenswrapper[4824]: I0224 00:11:39.249575 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 24 00:11:39 crc kubenswrapper[4824]: I0224 00:11:39.264593 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 24 00:11:39 crc kubenswrapper[4824]: I0224 00:11:39.469362 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 24 00:11:39 crc kubenswrapper[4824]: I0224 00:11:39.503482 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 24 00:11:39 crc kubenswrapper[4824]: I0224 00:11:39.548248 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 24 00:11:39 crc kubenswrapper[4824]: I0224 00:11:39.552279 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 24 00:11:39 crc kubenswrapper[4824]: I0224 00:11:39.576239 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 24 00:11:39 crc kubenswrapper[4824]: I0224 00:11:39.580861 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 24 00:11:39 crc kubenswrapper[4824]: I0224 00:11:39.610268 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 24 00:11:39 crc 
kubenswrapper[4824]: I0224 00:11:39.660455 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 24 00:11:39 crc kubenswrapper[4824]: I0224 00:11:39.688441 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 24 00:11:39 crc kubenswrapper[4824]: I0224 00:11:39.791567 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 24 00:11:39 crc kubenswrapper[4824]: I0224 00:11:39.905298 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 24 00:11:40 crc kubenswrapper[4824]: I0224 00:11:40.009775 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 24 00:11:40 crc kubenswrapper[4824]: I0224 00:11:40.405204 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 24 00:11:40 crc kubenswrapper[4824]: I0224 00:11:40.447126 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 24 00:11:40 crc kubenswrapper[4824]: I0224 00:11:40.515829 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 24 00:11:40 crc kubenswrapper[4824]: I0224 00:11:40.579938 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 24 00:11:40 crc kubenswrapper[4824]: I0224 00:11:40.642339 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 24 00:11:40 crc kubenswrapper[4824]: I0224 00:11:40.660935 4824 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 24 00:11:40 crc kubenswrapper[4824]: I0224 00:11:40.720300 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 24 00:11:40 crc kubenswrapper[4824]: I0224 00:11:40.751679 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 24 00:11:40 crc kubenswrapper[4824]: I0224 00:11:40.760534 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 24 00:11:40 crc kubenswrapper[4824]: I0224 00:11:40.809174 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 24 00:11:40 crc kubenswrapper[4824]: I0224 00:11:40.811335 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 24 00:11:40 crc kubenswrapper[4824]: I0224 00:11:40.882051 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 24 00:11:40 crc kubenswrapper[4824]: I0224 00:11:40.893466 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 24 00:11:41 crc kubenswrapper[4824]: I0224 00:11:41.014367 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 24 00:11:41 crc kubenswrapper[4824]: I0224 00:11:41.068977 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 24 00:11:41 crc kubenswrapper[4824]: I0224 00:11:41.085975 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 24 
00:11:41 crc kubenswrapper[4824]: I0224 00:11:41.157331 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 24 00:11:41 crc kubenswrapper[4824]: I0224 00:11:41.223824 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 24 00:11:41 crc kubenswrapper[4824]: I0224 00:11:41.225542 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 24 00:11:41 crc kubenswrapper[4824]: I0224 00:11:41.280685 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 24 00:11:41 crc kubenswrapper[4824]: I0224 00:11:41.330075 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 24 00:11:41 crc kubenswrapper[4824]: I0224 00:11:41.372359 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 24 00:11:41 crc kubenswrapper[4824]: I0224 00:11:41.391101 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 24 00:11:41 crc kubenswrapper[4824]: I0224 00:11:41.454259 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 24 00:11:41 crc kubenswrapper[4824]: I0224 00:11:41.538210 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 24 00:11:41 crc kubenswrapper[4824]: I0224 00:11:41.558046 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 24 00:11:41 crc kubenswrapper[4824]: I0224 00:11:41.633422 4824 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 24 00:11:41 crc kubenswrapper[4824]: I0224 00:11:41.643242 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 24 00:11:41 crc kubenswrapper[4824]: I0224 00:11:41.665817 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 24 00:11:41 crc kubenswrapper[4824]: I0224 00:11:41.715634 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 24 00:11:41 crc kubenswrapper[4824]: I0224 00:11:41.806458 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 24 00:11:41 crc kubenswrapper[4824]: I0224 00:11:41.923832 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 24 00:11:42 crc kubenswrapper[4824]: I0224 00:11:42.032550 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 24 00:11:42 crc kubenswrapper[4824]: I0224 00:11:42.037667 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 24 00:11:42 crc kubenswrapper[4824]: I0224 00:11:42.153313 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 24 00:11:42 crc kubenswrapper[4824]: I0224 00:11:42.244213 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 24 00:11:42 crc kubenswrapper[4824]: I0224 00:11:42.274779 4824 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console-operator"/"trusted-ca" Feb 24 00:11:42 crc kubenswrapper[4824]: I0224 00:11:42.289064 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 24 00:11:42 crc kubenswrapper[4824]: I0224 00:11:42.345228 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 24 00:11:42 crc kubenswrapper[4824]: I0224 00:11:42.347776 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 24 00:11:42 crc kubenswrapper[4824]: I0224 00:11:42.398248 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 24 00:11:42 crc kubenswrapper[4824]: I0224 00:11:42.417530 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 24 00:11:42 crc kubenswrapper[4824]: I0224 00:11:42.502385 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 24 00:11:42 crc kubenswrapper[4824]: I0224 00:11:42.518812 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 24 00:11:42 crc kubenswrapper[4824]: I0224 00:11:42.573948 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 24 00:11:42 crc kubenswrapper[4824]: I0224 00:11:42.580418 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 24 00:11:42 crc kubenswrapper[4824]: I0224 00:11:42.650952 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 24 00:11:42 crc 
kubenswrapper[4824]: I0224 00:11:42.686433 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 24 00:11:42 crc kubenswrapper[4824]: I0224 00:11:42.915672 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 24 00:11:42 crc kubenswrapper[4824]: I0224 00:11:42.992823 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 24 00:11:43 crc kubenswrapper[4824]: I0224 00:11:43.019935 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 24 00:11:43 crc kubenswrapper[4824]: I0224 00:11:43.030139 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 24 00:11:43 crc kubenswrapper[4824]: I0224 00:11:43.113197 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 24 00:11:43 crc kubenswrapper[4824]: I0224 00:11:43.120055 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 24 00:11:43 crc kubenswrapper[4824]: I0224 00:11:43.121186 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 24 00:11:43 crc kubenswrapper[4824]: I0224 00:11:43.297044 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 24 00:11:43 crc kubenswrapper[4824]: I0224 00:11:43.308283 4824 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 24 00:11:43 crc kubenswrapper[4824]: I0224 00:11:43.311968 4824 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 24 00:11:43 crc 
kubenswrapper[4824]: I0224 00:11:43.312012 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 24 00:11:43 crc kubenswrapper[4824]: I0224 00:11:43.316790 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 00:11:43 crc kubenswrapper[4824]: I0224 00:11:43.338387 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 24 00:11:43 crc kubenswrapper[4824]: I0224 00:11:43.341982 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=19.341962438 podStartE2EDuration="19.341962438s" podCreationTimestamp="2026-02-24 00:11:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:11:43.338855873 +0000 UTC m=+367.328480372" watchObservedRunningTime="2026-02-24 00:11:43.341962438 +0000 UTC m=+367.331586907" Feb 24 00:11:43 crc kubenswrapper[4824]: I0224 00:11:43.346072 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 24 00:11:43 crc kubenswrapper[4824]: I0224 00:11:43.378958 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 24 00:11:43 crc kubenswrapper[4824]: I0224 00:11:43.387537 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 24 00:11:43 crc kubenswrapper[4824]: I0224 00:11:43.460808 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 24 00:11:43 crc kubenswrapper[4824]: I0224 00:11:43.479425 4824 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Feb 24 00:11:43 crc kubenswrapper[4824]: I0224 00:11:43.639656 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Feb 24 00:11:43 crc kubenswrapper[4824]: I0224 00:11:43.701756 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Feb 24 00:11:43 crc kubenswrapper[4824]: I0224 00:11:43.768997 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Feb 24 00:11:43 crc kubenswrapper[4824]: I0224 00:11:43.818704 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Feb 24 00:11:43 crc kubenswrapper[4824]: I0224 00:11:43.866316 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Feb 24 00:11:43 crc kubenswrapper[4824]: I0224 00:11:43.994217 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Feb 24 00:11:44 crc kubenswrapper[4824]: I0224 00:11:44.028746 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Feb 24 00:11:44 crc kubenswrapper[4824]: I0224 00:11:44.051858 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Feb 24 00:11:44 crc kubenswrapper[4824]: I0224 00:11:44.073545 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Feb 24 00:11:44 crc kubenswrapper[4824]: I0224 00:11:44.073816 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Feb 24 00:11:44 crc kubenswrapper[4824]: I0224 00:11:44.108989 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Feb 24 00:11:44 crc kubenswrapper[4824]: I0224 00:11:44.167277 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Feb 24 00:11:44 crc kubenswrapper[4824]: I0224 00:11:44.187367 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Feb 24 00:11:44 crc kubenswrapper[4824]: I0224 00:11:44.240375 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Feb 24 00:11:44 crc kubenswrapper[4824]: I0224 00:11:44.259129 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 24 00:11:44 crc kubenswrapper[4824]: I0224 00:11:44.259479 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Feb 24 00:11:44 crc kubenswrapper[4824]: I0224 00:11:44.260096 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Feb 24 00:11:44 crc kubenswrapper[4824]: I0224 00:11:44.305852 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Feb 24 00:11:44 crc kubenswrapper[4824]: I0224 00:11:44.436103 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Feb 24 00:11:44 crc kubenswrapper[4824]: I0224 00:11:44.496014 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Feb 24 00:11:44 crc kubenswrapper[4824]: I0224 00:11:44.533598 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Feb 24 00:11:44 crc kubenswrapper[4824]: I0224 00:11:44.586284 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Feb 24 00:11:44 crc kubenswrapper[4824]: I0224 00:11:44.603628 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Feb 24 00:11:44 crc kubenswrapper[4824]: I0224 00:11:44.604113 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Feb 24 00:11:44 crc kubenswrapper[4824]: I0224 00:11:44.825557 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Feb 24 00:11:44 crc kubenswrapper[4824]: I0224 00:11:44.843413 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Feb 24 00:11:44 crc kubenswrapper[4824]: I0224 00:11:44.868335 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Feb 24 00:11:44 crc kubenswrapper[4824]: I0224 00:11:44.882361 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Feb 24 00:11:44 crc kubenswrapper[4824]: I0224 00:11:44.952488 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Feb 24 00:11:44 crc kubenswrapper[4824]: I0224 00:11:44.970699 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 24 00:11:45 crc kubenswrapper[4824]: I0224 00:11:45.033599 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Feb 24 00:11:45 crc kubenswrapper[4824]: I0224 00:11:45.039018 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 24 00:11:45 crc kubenswrapper[4824]: I0224 00:11:45.088340 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Feb 24 00:11:45 crc kubenswrapper[4824]: I0224 00:11:45.104794 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Feb 24 00:11:45 crc kubenswrapper[4824]: I0224 00:11:45.241882 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Feb 24 00:11:45 crc kubenswrapper[4824]: I0224 00:11:45.258573 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Feb 24 00:11:45 crc kubenswrapper[4824]: I0224 00:11:45.292256 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Feb 24 00:11:45 crc kubenswrapper[4824]: I0224 00:11:45.521418 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Feb 24 00:11:45 crc kubenswrapper[4824]: I0224 00:11:45.602990 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Feb 24 00:11:45 crc kubenswrapper[4824]: I0224 00:11:45.671959 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Feb 24 00:11:45 crc kubenswrapper[4824]: I0224 00:11:45.717383 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Feb 24 00:11:45 crc kubenswrapper[4824]: I0224 00:11:45.730107 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Feb 24 00:11:45 crc kubenswrapper[4824]: I0224 00:11:45.736508 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Feb 24 00:11:45 crc kubenswrapper[4824]: I0224 00:11:45.744449 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Feb 24 00:11:45 crc kubenswrapper[4824]: I0224 00:11:45.785486 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Feb 24 00:11:45 crc kubenswrapper[4824]: I0224 00:11:45.829062 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Feb 24 00:11:45 crc kubenswrapper[4824]: I0224 00:11:45.904507 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Feb 24 00:11:46 crc kubenswrapper[4824]: I0224 00:11:46.011639 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Feb 24 00:11:46 crc kubenswrapper[4824]: I0224 00:11:46.081865 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Feb 24 00:11:46 crc kubenswrapper[4824]: I0224 00:11:46.108744 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 24 00:11:46 crc kubenswrapper[4824]: I0224 00:11:46.143646 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Feb 24 00:11:46 crc kubenswrapper[4824]: I0224 00:11:46.146743 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Feb 24 00:11:46 crc kubenswrapper[4824]: I0224 00:11:46.167263 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Feb 24 00:11:46 crc kubenswrapper[4824]: I0224 00:11:46.272782 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Feb 24 00:11:46 crc kubenswrapper[4824]: I0224 00:11:46.294675 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Feb 24 00:11:46 crc kubenswrapper[4824]: I0224 00:11:46.340411 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Feb 24 00:11:46 crc kubenswrapper[4824]: I0224 00:11:46.442628 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Feb 24 00:11:46 crc kubenswrapper[4824]: I0224 00:11:46.465248 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Feb 24 00:11:46 crc kubenswrapper[4824]: I0224 00:11:46.557958 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Feb 24 00:11:46 crc kubenswrapper[4824]: I0224 00:11:46.590863 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Feb 24 00:11:46 crc kubenswrapper[4824]: I0224 00:11:46.683495 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Feb 24 00:11:46 crc kubenswrapper[4824]: I0224 00:11:46.689104 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Feb 24 00:11:46 crc kubenswrapper[4824]: I0224 00:11:46.699025 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Feb 24 00:11:46 crc kubenswrapper[4824]: I0224 00:11:46.776836 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Feb 24 00:11:46 crc kubenswrapper[4824]: I0224 00:11:46.840593 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Feb 24 00:11:46 crc kubenswrapper[4824]: I0224 00:11:46.875668 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Feb 24 00:11:46 crc kubenswrapper[4824]: I0224 00:11:46.943830 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Feb 24 00:11:47 crc kubenswrapper[4824]: I0224 00:11:47.105878 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Feb 24 00:11:47 crc kubenswrapper[4824]: I0224 00:11:47.165745 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Feb 24 00:11:47 crc kubenswrapper[4824]: I0224 00:11:47.195032 4824 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Feb 24 00:11:47 crc kubenswrapper[4824]: I0224 00:11:47.195401 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://1491c4b381f60602c7a24a8e5a735a39ea625073883a8aa7fd3568c56da3a7e7" gracePeriod=5
Feb 24 00:11:47 crc kubenswrapper[4824]: I0224 00:11:47.197111 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Feb 24 00:11:47 crc kubenswrapper[4824]: I0224 00:11:47.293683 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Feb 24 00:11:47 crc kubenswrapper[4824]: I0224 00:11:47.301936 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 24 00:11:47 crc kubenswrapper[4824]: I0224 00:11:47.329016 4824 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Feb 24 00:11:47 crc kubenswrapper[4824]: I0224 00:11:47.374149 4824 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Feb 24 00:11:47 crc kubenswrapper[4824]: I0224 00:11:47.420998 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Feb 24 00:11:47 crc kubenswrapper[4824]: I0224 00:11:47.433732 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Feb 24 00:11:47 crc kubenswrapper[4824]: I0224 00:11:47.518161 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Feb 24 00:11:47 crc kubenswrapper[4824]: I0224 00:11:47.586403 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Feb 24 00:11:47 crc kubenswrapper[4824]: I0224 00:11:47.626211 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Feb 24 00:11:47 crc kubenswrapper[4824]: I0224 00:11:47.665137 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Feb 24 00:11:47 crc kubenswrapper[4824]: I0224 00:11:47.673080 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Feb 24 00:11:47 crc kubenswrapper[4824]: I0224 00:11:47.692321 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Feb 24 00:11:47 crc kubenswrapper[4824]: I0224 00:11:47.821679 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Feb 24 00:11:47 crc kubenswrapper[4824]: I0224 00:11:47.842709 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Feb 24 00:11:47 crc kubenswrapper[4824]: I0224 00:11:47.919924 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Feb 24 00:11:48 crc kubenswrapper[4824]: I0224 00:11:48.030963 4824 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Feb 24 00:11:48 crc kubenswrapper[4824]: I0224 00:11:48.031033 4824 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Feb 24 00:11:48 crc kubenswrapper[4824]: I0224 00:11:48.031102 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 24 00:11:48 crc kubenswrapper[4824]: I0224 00:11:48.033055 4824 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"34eb8fb83da5aca983d6da868242cce539ecbefeda8efd3d70063bb191fa81ec"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted"
Feb 24 00:11:48 crc kubenswrapper[4824]: I0224 00:11:48.033269 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://34eb8fb83da5aca983d6da868242cce539ecbefeda8efd3d70063bb191fa81ec" gracePeriod=30
Feb 24 00:11:48 crc kubenswrapper[4824]: I0224 00:11:48.094865 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Feb 24 00:11:48 crc kubenswrapper[4824]: I0224 00:11:48.127882 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Feb 24 00:11:48 crc kubenswrapper[4824]: I0224 00:11:48.128559 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 24 00:11:48 crc kubenswrapper[4824]: I0224 00:11:48.198643 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Feb 24 00:11:48 crc kubenswrapper[4824]: I0224 00:11:48.211565 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Feb 24 00:11:48 crc kubenswrapper[4824]: I0224 00:11:48.221937 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Feb 24 00:11:48 crc kubenswrapper[4824]: I0224 00:11:48.231033 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Feb 24 00:11:48 crc kubenswrapper[4824]: I0224 00:11:48.241783 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Feb 24 00:11:48 crc kubenswrapper[4824]: I0224 00:11:48.360124 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Feb 24 00:11:48 crc kubenswrapper[4824]: I0224 00:11:48.496589 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Feb 24 00:11:48 crc kubenswrapper[4824]: I0224 00:11:48.608226 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Feb 24 00:11:48 crc kubenswrapper[4824]: I0224 00:11:48.618399 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Feb 24 00:11:48 crc kubenswrapper[4824]: I0224 00:11:48.713618 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Feb 24 00:11:48 crc kubenswrapper[4824]: I0224 00:11:48.838038 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Feb 24 00:11:48 crc kubenswrapper[4824]: I0224 00:11:48.869109 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 24 00:11:48 crc kubenswrapper[4824]: I0224 00:11:48.922738 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Feb 24 00:11:49 crc kubenswrapper[4824]: I0224 00:11:49.025490 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Feb 24 00:11:49 crc kubenswrapper[4824]: I0224 00:11:49.033156 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 24 00:11:49 crc kubenswrapper[4824]: I0224 00:11:49.109640 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Feb 24 00:11:49 crc kubenswrapper[4824]: I0224 00:11:49.176910 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Feb 24 00:11:49 crc kubenswrapper[4824]: I0224 00:11:49.239150 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Feb 24 00:11:49 crc kubenswrapper[4824]: I0224 00:11:49.241016 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Feb 24 00:11:49 crc kubenswrapper[4824]: I0224 00:11:49.333125 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Feb 24 00:11:49 crc kubenswrapper[4824]: I0224 00:11:49.376919 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Feb 24 00:11:49 crc kubenswrapper[4824]: I0224 00:11:49.390630 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Feb 24 00:11:49 crc kubenswrapper[4824]: I0224 00:11:49.487927 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Feb 24 00:11:49 crc kubenswrapper[4824]: I0224 00:11:49.548761 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Feb 24 00:11:49 crc kubenswrapper[4824]: I0224 00:11:49.567457 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Feb 24 00:11:49 crc kubenswrapper[4824]: I0224 00:11:49.607552 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Feb 24 00:11:49 crc kubenswrapper[4824]: I0224 00:11:49.662922 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Feb 24 00:11:49 crc kubenswrapper[4824]: I0224 00:11:49.663557 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Feb 24 00:11:49 crc kubenswrapper[4824]: I0224 00:11:49.688101 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Feb 24 00:11:49 crc kubenswrapper[4824]: I0224 00:11:49.760701 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Feb 24 00:11:49 crc kubenswrapper[4824]: I0224 00:11:49.870212 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Feb 24 00:11:50 crc kubenswrapper[4824]: I0224 00:11:50.125048 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Feb 24 00:11:50 crc kubenswrapper[4824]: I0224 00:11:50.134970 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Feb 24 00:11:50 crc kubenswrapper[4824]: I0224 00:11:50.150236 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Feb 24 00:11:50 crc kubenswrapper[4824]: I0224 00:11:50.290686 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Feb 24 00:11:50 crc kubenswrapper[4824]: I0224 00:11:50.353561 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dmjz7"]
Feb 24 00:11:50 crc kubenswrapper[4824]: I0224 00:11:50.353944 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dmjz7" podUID="cc119514-5c95-4925-8a1a-3e6844a34e1e" containerName="registry-server" containerID="cri-o://ace099e248827a099f2f9964ea688049a17101eb26b429197dc321a43c82c971" gracePeriod=30
Feb 24 00:11:50 crc kubenswrapper[4824]: I0224 00:11:50.372402 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hhftg"]
Feb 24 00:11:50 crc kubenswrapper[4824]: I0224 00:11:50.373225 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hhftg" podUID="3e306ddf-071d-47f2-b9b1-bf772963438e" containerName="registry-server" containerID="cri-o://83d0a00bbb287f8c717cb0e93e56c8a769b62fbe8a1114585fcf0819cddb1d85" gracePeriod=30
Feb 24 00:11:50 crc kubenswrapper[4824]: I0224 00:11:50.391393 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-99tkw"]
Feb 24 00:11:50 crc kubenswrapper[4824]: I0224 00:11:50.391698 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-99tkw" podUID="e312a49f-dc7a-49fc-9baf-3105fec587ae" containerName="marketplace-operator" containerID="cri-o://1f7f84523e39d2e74db2895c5b1819295512a987f6083e74a45c4c25f78e706d" gracePeriod=30
Feb 24 00:11:50 crc kubenswrapper[4824]: I0224 00:11:50.391861 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Feb 24 00:11:50 crc kubenswrapper[4824]: I0224 00:11:50.404632 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nzqwf"]
Feb 24 00:11:50 crc kubenswrapper[4824]: I0224 00:11:50.404714 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gl27t"]
Feb 24 00:11:50 crc kubenswrapper[4824]: I0224 00:11:50.404982 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gl27t" podUID="2da73289-3f96-4828-a106-46c3b0469e7d" containerName="registry-server" containerID="cri-o://f3129bb41cd26ff02fd1b16661272cba00c8572d524dd0295795e4e681de10f0" gracePeriod=30
Feb 24 00:11:50 crc kubenswrapper[4824]: I0224 00:11:50.405681 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nzqwf" podUID="b142d96b-87c3-444b-b135-fdddaa658234" containerName="registry-server" containerID="cri-o://d8cb34947ec733a964cff732c9bb70c2d8c98ea3a605270b5ec9f8c81b631a37" gracePeriod=30
Feb 24 00:11:50 crc kubenswrapper[4824]: I0224 00:11:50.407760 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zxplg"]
Feb 24 00:11:50 crc kubenswrapper[4824]: I0224 00:11:50.408078 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zxplg" podUID="7a78c7d6-6ec6-4857-af87-25c5c8cf961d" containerName="registry-server" containerID="cri-o://da0889e558f40879c1fabc025b056771135a794508cda04386b8dc975f871518" gracePeriod=30
Feb 24 00:11:50 crc kubenswrapper[4824]: I0224 00:11:50.469026 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Feb 24 00:11:50 crc kubenswrapper[4824]: I0224 00:11:50.605569 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Feb 24 00:11:50 crc kubenswrapper[4824]: I0224 00:11:50.677264 4824 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Feb 24 00:11:50 crc kubenswrapper[4824]: E0224 00:11:50.820590 4824 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of da0889e558f40879c1fabc025b056771135a794508cda04386b8dc975f871518 is running failed: container process not found" containerID="da0889e558f40879c1fabc025b056771135a794508cda04386b8dc975f871518" cmd=["grpc_health_probe","-addr=:50051"]
Feb 24 00:11:50 crc kubenswrapper[4824]: E0224 00:11:50.822073 4824 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of da0889e558f40879c1fabc025b056771135a794508cda04386b8dc975f871518 is running failed: container process not found" containerID="da0889e558f40879c1fabc025b056771135a794508cda04386b8dc975f871518" cmd=["grpc_health_probe","-addr=:50051"]
Feb 24 00:11:50 crc kubenswrapper[4824]: E0224 00:11:50.822502 4824 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of da0889e558f40879c1fabc025b056771135a794508cda04386b8dc975f871518 is running failed: container process not found" containerID="da0889e558f40879c1fabc025b056771135a794508cda04386b8dc975f871518" cmd=["grpc_health_probe","-addr=:50051"]
Feb 24 00:11:50 crc kubenswrapper[4824]: E0224 00:11:50.822614 4824 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of da0889e558f40879c1fabc025b056771135a794508cda04386b8dc975f871518 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-zxplg" podUID="7a78c7d6-6ec6-4857-af87-25c5c8cf961d" containerName="registry-server"
Feb 24 00:11:50 crc kubenswrapper[4824]: I0224 00:11:50.861002 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dmjz7"
Feb 24 00:11:50 crc kubenswrapper[4824]: I0224 00:11:50.908319 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Feb 24 00:11:50 crc kubenswrapper[4824]: I0224 00:11:50.917180 4824 generic.go:334] "Generic (PLEG): container finished" podID="3e306ddf-071d-47f2-b9b1-bf772963438e" containerID="83d0a00bbb287f8c717cb0e93e56c8a769b62fbe8a1114585fcf0819cddb1d85" exitCode=0
Feb 24 00:11:50 crc kubenswrapper[4824]: I0224 00:11:50.917267 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hhftg" event={"ID":"3e306ddf-071d-47f2-b9b1-bf772963438e","Type":"ContainerDied","Data":"83d0a00bbb287f8c717cb0e93e56c8a769b62fbe8a1114585fcf0819cddb1d85"}
Feb 24 00:11:50 crc kubenswrapper[4824]: I0224 00:11:50.919672 4824 generic.go:334] "Generic (PLEG): container finished" podID="2da73289-3f96-4828-a106-46c3b0469e7d" containerID="f3129bb41cd26ff02fd1b16661272cba00c8572d524dd0295795e4e681de10f0" exitCode=0
Feb 24 00:11:50 crc kubenswrapper[4824]: I0224 00:11:50.935457 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gl27t" event={"ID":"2da73289-3f96-4828-a106-46c3b0469e7d","Type":"ContainerDied","Data":"f3129bb41cd26ff02fd1b16661272cba00c8572d524dd0295795e4e681de10f0"}
Feb 24 00:11:50 crc kubenswrapper[4824]: I0224 00:11:50.948115 4824 generic.go:334] "Generic (PLEG): container finished" podID="7a78c7d6-6ec6-4857-af87-25c5c8cf961d" containerID="da0889e558f40879c1fabc025b056771135a794508cda04386b8dc975f871518" exitCode=0
Feb 24 00:11:50 crc kubenswrapper[4824]: I0224 00:11:50.948265 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zxplg" event={"ID":"7a78c7d6-6ec6-4857-af87-25c5c8cf961d","Type":"ContainerDied","Data":"da0889e558f40879c1fabc025b056771135a794508cda04386b8dc975f871518"}
Feb 24 00:11:50 crc kubenswrapper[4824]: I0224 00:11:50.970838 4824 generic.go:334] "Generic (PLEG): container finished" podID="b142d96b-87c3-444b-b135-fdddaa658234" containerID="d8cb34947ec733a964cff732c9bb70c2d8c98ea3a605270b5ec9f8c81b631a37" exitCode=0
Feb 24 00:11:50 crc kubenswrapper[4824]: I0224 00:11:50.970939 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nzqwf" event={"ID":"b142d96b-87c3-444b-b135-fdddaa658234","Type":"ContainerDied","Data":"d8cb34947ec733a964cff732c9bb70c2d8c98ea3a605270b5ec9f8c81b631a37"}
Feb 24 00:11:50 crc kubenswrapper[4824]: I0224 00:11:50.977285 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxnlm\" (UniqueName: \"kubernetes.io/projected/cc119514-5c95-4925-8a1a-3e6844a34e1e-kube-api-access-bxnlm\") pod \"cc119514-5c95-4925-8a1a-3e6844a34e1e\" (UID: \"cc119514-5c95-4925-8a1a-3e6844a34e1e\") "
Feb 24 00:11:50 crc kubenswrapper[4824]: I0224 00:11:50.977388 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc119514-5c95-4925-8a1a-3e6844a34e1e-utilities\") pod \"cc119514-5c95-4925-8a1a-3e6844a34e1e\" (UID: \"cc119514-5c95-4925-8a1a-3e6844a34e1e\") "
Feb 24 00:11:50 crc kubenswrapper[4824]: I0224 00:11:50.977487 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc119514-5c95-4925-8a1a-3e6844a34e1e-catalog-content\") pod \"cc119514-5c95-4925-8a1a-3e6844a34e1e\" (UID: \"cc119514-5c95-4925-8a1a-3e6844a34e1e\") "
Feb 24 00:11:50 crc kubenswrapper[4824]: I0224 00:11:50.982811 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc119514-5c95-4925-8a1a-3e6844a34e1e-utilities" (OuterVolumeSpecName: "utilities") pod "cc119514-5c95-4925-8a1a-3e6844a34e1e" (UID: "cc119514-5c95-4925-8a1a-3e6844a34e1e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 00:11:50 crc kubenswrapper[4824]: I0224 00:11:50.987146 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc119514-5c95-4925-8a1a-3e6844a34e1e-kube-api-access-bxnlm" (OuterVolumeSpecName: "kube-api-access-bxnlm") pod "cc119514-5c95-4925-8a1a-3e6844a34e1e" (UID: "cc119514-5c95-4925-8a1a-3e6844a34e1e"). InnerVolumeSpecName "kube-api-access-bxnlm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 00:11:50 crc kubenswrapper[4824]: I0224 00:11:50.988894 4824 generic.go:334] "Generic (PLEG): container finished" podID="e312a49f-dc7a-49fc-9baf-3105fec587ae" containerID="1f7f84523e39d2e74db2895c5b1819295512a987f6083e74a45c4c25f78e706d" exitCode=0
Feb 24 00:11:50 crc kubenswrapper[4824]: I0224 00:11:50.988998 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-99tkw" event={"ID":"e312a49f-dc7a-49fc-9baf-3105fec587ae","Type":"ContainerDied","Data":"1f7f84523e39d2e74db2895c5b1819295512a987f6083e74a45c4c25f78e706d"}
Feb 24 00:11:50 crc kubenswrapper[4824]: I0224 00:11:50.994158 4824 generic.go:334] "Generic (PLEG): container finished" podID="cc119514-5c95-4925-8a1a-3e6844a34e1e" containerID="ace099e248827a099f2f9964ea688049a17101eb26b429197dc321a43c82c971" exitCode=0
Feb 24 00:11:50 crc kubenswrapper[4824]: I0224 00:11:50.994201 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dmjz7" event={"ID":"cc119514-5c95-4925-8a1a-3e6844a34e1e","Type":"ContainerDied","Data":"ace099e248827a099f2f9964ea688049a17101eb26b429197dc321a43c82c971"}
Feb 24 00:11:50 crc kubenswrapper[4824]: I0224 00:11:50.994226 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dmjz7" event={"ID":"cc119514-5c95-4925-8a1a-3e6844a34e1e","Type":"ContainerDied","Data":"42b124ba705dc951f666837537c3a14e76c91f608879722e252c98578703a4ac"}
Feb 24 00:11:50 crc kubenswrapper[4824]: I0224 00:11:50.994246 4824 scope.go:117] "RemoveContainer" containerID="ace099e248827a099f2f9964ea688049a17101eb26b429197dc321a43c82c971"
Feb 24 00:11:50 crc kubenswrapper[4824]: I0224 00:11:50.994419 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dmjz7"
Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.000830 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hhftg"
Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.003957 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-99tkw"
Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.018048 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gl27t"
Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.021074 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zxplg"
Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.028846 4824 scope.go:117] "RemoveContainer" containerID="49dee3eafa362f85cb1a0da9a797f4ffd8f10b2e2e1b307ecd8002d23b24ca9f"
Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.046036 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nzqwf"
Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.046369 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc119514-5c95-4925-8a1a-3e6844a34e1e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cc119514-5c95-4925-8a1a-3e6844a34e1e" (UID: "cc119514-5c95-4925-8a1a-3e6844a34e1e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.059169 4824 scope.go:117] "RemoveContainer" containerID="0a0ed56ec1c248b7e9a29be5b115fbce61e3aaf9d9d45ae0ef4e45bef08de9cc"
Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.078333 4824 scope.go:117] "RemoveContainer" containerID="ace099e248827a099f2f9964ea688049a17101eb26b429197dc321a43c82c971"
Feb 24 00:11:51 crc kubenswrapper[4824]: E0224 00:11:51.078898 4824 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ace099e248827a099f2f9964ea688049a17101eb26b429197dc321a43c82c971\": container with ID starting with ace099e248827a099f2f9964ea688049a17101eb26b429197dc321a43c82c971 not found: ID does not exist" containerID="ace099e248827a099f2f9964ea688049a17101eb26b429197dc321a43c82c971"
Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.078958 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ace099e248827a099f2f9964ea688049a17101eb26b429197dc321a43c82c971"} err="failed to get container status \"ace099e248827a099f2f9964ea688049a17101eb26b429197dc321a43c82c971\": rpc error: code = NotFound desc = could not find container \"ace099e248827a099f2f9964ea688049a17101eb26b429197dc321a43c82c971\": container with ID starting with ace099e248827a099f2f9964ea688049a17101eb26b429197dc321a43c82c971 not found: ID does not exist"
Feb 24 00:11:51 crc kubenswrapper[4824]: I0224
00:11:51.079080 4824 scope.go:117] "RemoveContainer" containerID="49dee3eafa362f85cb1a0da9a797f4ffd8f10b2e2e1b307ecd8002d23b24ca9f" Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.079704 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6f76k\" (UniqueName: \"kubernetes.io/projected/e312a49f-dc7a-49fc-9baf-3105fec587ae-kube-api-access-6f76k\") pod \"e312a49f-dc7a-49fc-9baf-3105fec587ae\" (UID: \"e312a49f-dc7a-49fc-9baf-3105fec587ae\") " Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.079854 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e306ddf-071d-47f2-b9b1-bf772963438e-catalog-content\") pod \"3e306ddf-071d-47f2-b9b1-bf772963438e\" (UID: \"3e306ddf-071d-47f2-b9b1-bf772963438e\") " Feb 24 00:11:51 crc kubenswrapper[4824]: E0224 00:11:51.079881 4824 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49dee3eafa362f85cb1a0da9a797f4ffd8f10b2e2e1b307ecd8002d23b24ca9f\": container with ID starting with 49dee3eafa362f85cb1a0da9a797f4ffd8f10b2e2e1b307ecd8002d23b24ca9f not found: ID does not exist" containerID="49dee3eafa362f85cb1a0da9a797f4ffd8f10b2e2e1b307ecd8002d23b24ca9f" Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.079911 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49dee3eafa362f85cb1a0da9a797f4ffd8f10b2e2e1b307ecd8002d23b24ca9f"} err="failed to get container status \"49dee3eafa362f85cb1a0da9a797f4ffd8f10b2e2e1b307ecd8002d23b24ca9f\": rpc error: code = NotFound desc = could not find container \"49dee3eafa362f85cb1a0da9a797f4ffd8f10b2e2e1b307ecd8002d23b24ca9f\": container with ID starting with 49dee3eafa362f85cb1a0da9a797f4ffd8f10b2e2e1b307ecd8002d23b24ca9f not found: ID does not exist" Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.079930 4824 scope.go:117] 
"RemoveContainer" containerID="0a0ed56ec1c248b7e9a29be5b115fbce61e3aaf9d9d45ae0ef4e45bef08de9cc" Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.079932 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e312a49f-dc7a-49fc-9baf-3105fec587ae-marketplace-trusted-ca\") pod \"e312a49f-dc7a-49fc-9baf-3105fec587ae\" (UID: \"e312a49f-dc7a-49fc-9baf-3105fec587ae\") " Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.080036 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e312a49f-dc7a-49fc-9baf-3105fec587ae-marketplace-operator-metrics\") pod \"e312a49f-dc7a-49fc-9baf-3105fec587ae\" (UID: \"e312a49f-dc7a-49fc-9baf-3105fec587ae\") " Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.080117 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e306ddf-071d-47f2-b9b1-bf772963438e-utilities\") pod \"3e306ddf-071d-47f2-b9b1-bf772963438e\" (UID: \"3e306ddf-071d-47f2-b9b1-bf772963438e\") " Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.080169 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndbcf\" (UniqueName: \"kubernetes.io/projected/3e306ddf-071d-47f2-b9b1-bf772963438e-kube-api-access-ndbcf\") pod \"3e306ddf-071d-47f2-b9b1-bf772963438e\" (UID: \"3e306ddf-071d-47f2-b9b1-bf772963438e\") " Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.080763 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e312a49f-dc7a-49fc-9baf-3105fec587ae-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "e312a49f-dc7a-49fc-9baf-3105fec587ae" (UID: "e312a49f-dc7a-49fc-9baf-3105fec587ae"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.080887 4824 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc119514-5c95-4925-8a1a-3e6844a34e1e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.080918 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxnlm\" (UniqueName: \"kubernetes.io/projected/cc119514-5c95-4925-8a1a-3e6844a34e1e-kube-api-access-bxnlm\") on node \"crc\" DevicePath \"\"" Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.080934 4824 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc119514-5c95-4925-8a1a-3e6844a34e1e-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.080946 4824 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e312a49f-dc7a-49fc-9baf-3105fec587ae-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.081961 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e306ddf-071d-47f2-b9b1-bf772963438e-utilities" (OuterVolumeSpecName: "utilities") pod "3e306ddf-071d-47f2-b9b1-bf772963438e" (UID: "3e306ddf-071d-47f2-b9b1-bf772963438e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:11:51 crc kubenswrapper[4824]: E0224 00:11:51.082614 4824 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a0ed56ec1c248b7e9a29be5b115fbce61e3aaf9d9d45ae0ef4e45bef08de9cc\": container with ID starting with 0a0ed56ec1c248b7e9a29be5b115fbce61e3aaf9d9d45ae0ef4e45bef08de9cc not found: ID does not exist" containerID="0a0ed56ec1c248b7e9a29be5b115fbce61e3aaf9d9d45ae0ef4e45bef08de9cc" Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.082703 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a0ed56ec1c248b7e9a29be5b115fbce61e3aaf9d9d45ae0ef4e45bef08de9cc"} err="failed to get container status \"0a0ed56ec1c248b7e9a29be5b115fbce61e3aaf9d9d45ae0ef4e45bef08de9cc\": rpc error: code = NotFound desc = could not find container \"0a0ed56ec1c248b7e9a29be5b115fbce61e3aaf9d9d45ae0ef4e45bef08de9cc\": container with ID starting with 0a0ed56ec1c248b7e9a29be5b115fbce61e3aaf9d9d45ae0ef4e45bef08de9cc not found: ID does not exist" Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.083555 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e312a49f-dc7a-49fc-9baf-3105fec587ae-kube-api-access-6f76k" (OuterVolumeSpecName: "kube-api-access-6f76k") pod "e312a49f-dc7a-49fc-9baf-3105fec587ae" (UID: "e312a49f-dc7a-49fc-9baf-3105fec587ae"). InnerVolumeSpecName "kube-api-access-6f76k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.084760 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e312a49f-dc7a-49fc-9baf-3105fec587ae-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "e312a49f-dc7a-49fc-9baf-3105fec587ae" (UID: "e312a49f-dc7a-49fc-9baf-3105fec587ae"). 
InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.086150 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e306ddf-071d-47f2-b9b1-bf772963438e-kube-api-access-ndbcf" (OuterVolumeSpecName: "kube-api-access-ndbcf") pod "3e306ddf-071d-47f2-b9b1-bf772963438e" (UID: "3e306ddf-071d-47f2-b9b1-bf772963438e"). InnerVolumeSpecName "kube-api-access-ndbcf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.118472 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.135452 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.154340 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e306ddf-071d-47f2-b9b1-bf772963438e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3e306ddf-071d-47f2-b9b1-bf772963438e" (UID: "3e306ddf-071d-47f2-b9b1-bf772963438e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.181664 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zc6ds\" (UniqueName: \"kubernetes.io/projected/7a78c7d6-6ec6-4857-af87-25c5c8cf961d-kube-api-access-zc6ds\") pod \"7a78c7d6-6ec6-4857-af87-25c5c8cf961d\" (UID: \"7a78c7d6-6ec6-4857-af87-25c5c8cf961d\") " Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.181716 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hggqz\" (UniqueName: \"kubernetes.io/projected/b142d96b-87c3-444b-b135-fdddaa658234-kube-api-access-hggqz\") pod \"b142d96b-87c3-444b-b135-fdddaa658234\" (UID: \"b142d96b-87c3-444b-b135-fdddaa658234\") " Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.181788 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a78c7d6-6ec6-4857-af87-25c5c8cf961d-catalog-content\") pod \"7a78c7d6-6ec6-4857-af87-25c5c8cf961d\" (UID: \"7a78c7d6-6ec6-4857-af87-25c5c8cf961d\") " Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.181835 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fhvh\" (UniqueName: \"kubernetes.io/projected/2da73289-3f96-4828-a106-46c3b0469e7d-kube-api-access-7fhvh\") pod \"2da73289-3f96-4828-a106-46c3b0469e7d\" (UID: \"2da73289-3f96-4828-a106-46c3b0469e7d\") " Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.181861 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b142d96b-87c3-444b-b135-fdddaa658234-utilities\") pod \"b142d96b-87c3-444b-b135-fdddaa658234\" (UID: \"b142d96b-87c3-444b-b135-fdddaa658234\") " Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.181883 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a78c7d6-6ec6-4857-af87-25c5c8cf961d-utilities\") pod \"7a78c7d6-6ec6-4857-af87-25c5c8cf961d\" (UID: \"7a78c7d6-6ec6-4857-af87-25c5c8cf961d\") " Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.181918 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2da73289-3f96-4828-a106-46c3b0469e7d-utilities\") pod \"2da73289-3f96-4828-a106-46c3b0469e7d\" (UID: \"2da73289-3f96-4828-a106-46c3b0469e7d\") " Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.181936 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2da73289-3f96-4828-a106-46c3b0469e7d-catalog-content\") pod \"2da73289-3f96-4828-a106-46c3b0469e7d\" (UID: \"2da73289-3f96-4828-a106-46c3b0469e7d\") " Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.181970 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b142d96b-87c3-444b-b135-fdddaa658234-catalog-content\") pod \"b142d96b-87c3-444b-b135-fdddaa658234\" (UID: \"b142d96b-87c3-444b-b135-fdddaa658234\") " Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.182220 4824 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e306ddf-071d-47f2-b9b1-bf772963438e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.182232 4824 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e312a49f-dc7a-49fc-9baf-3105fec587ae-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.182242 4824 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/3e306ddf-071d-47f2-b9b1-bf772963438e-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.182252 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndbcf\" (UniqueName: \"kubernetes.io/projected/3e306ddf-071d-47f2-b9b1-bf772963438e-kube-api-access-ndbcf\") on node \"crc\" DevicePath \"\"" Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.182263 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6f76k\" (UniqueName: \"kubernetes.io/projected/e312a49f-dc7a-49fc-9baf-3105fec587ae-kube-api-access-6f76k\") on node \"crc\" DevicePath \"\"" Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.183195 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b142d96b-87c3-444b-b135-fdddaa658234-utilities" (OuterVolumeSpecName: "utilities") pod "b142d96b-87c3-444b-b135-fdddaa658234" (UID: "b142d96b-87c3-444b-b135-fdddaa658234"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.183408 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a78c7d6-6ec6-4857-af87-25c5c8cf961d-utilities" (OuterVolumeSpecName: "utilities") pod "7a78c7d6-6ec6-4857-af87-25c5c8cf961d" (UID: "7a78c7d6-6ec6-4857-af87-25c5c8cf961d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.183478 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2da73289-3f96-4828-a106-46c3b0469e7d-utilities" (OuterVolumeSpecName: "utilities") pod "2da73289-3f96-4828-a106-46c3b0469e7d" (UID: "2da73289-3f96-4828-a106-46c3b0469e7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.185315 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b142d96b-87c3-444b-b135-fdddaa658234-kube-api-access-hggqz" (OuterVolumeSpecName: "kube-api-access-hggqz") pod "b142d96b-87c3-444b-b135-fdddaa658234" (UID: "b142d96b-87c3-444b-b135-fdddaa658234"). InnerVolumeSpecName "kube-api-access-hggqz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.185479 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2da73289-3f96-4828-a106-46c3b0469e7d-kube-api-access-7fhvh" (OuterVolumeSpecName: "kube-api-access-7fhvh") pod "2da73289-3f96-4828-a106-46c3b0469e7d" (UID: "2da73289-3f96-4828-a106-46c3b0469e7d"). InnerVolumeSpecName "kube-api-access-7fhvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.196287 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a78c7d6-6ec6-4857-af87-25c5c8cf961d-kube-api-access-zc6ds" (OuterVolumeSpecName: "kube-api-access-zc6ds") pod "7a78c7d6-6ec6-4857-af87-25c5c8cf961d" (UID: "7a78c7d6-6ec6-4857-af87-25c5c8cf961d"). InnerVolumeSpecName "kube-api-access-zc6ds". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.205643 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b142d96b-87c3-444b-b135-fdddaa658234-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b142d96b-87c3-444b-b135-fdddaa658234" (UID: "b142d96b-87c3-444b-b135-fdddaa658234"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.283633 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fhvh\" (UniqueName: \"kubernetes.io/projected/2da73289-3f96-4828-a106-46c3b0469e7d-kube-api-access-7fhvh\") on node \"crc\" DevicePath \"\"" Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.283683 4824 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b142d96b-87c3-444b-b135-fdddaa658234-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.283695 4824 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a78c7d6-6ec6-4857-af87-25c5c8cf961d-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.283703 4824 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2da73289-3f96-4828-a106-46c3b0469e7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.283712 4824 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b142d96b-87c3-444b-b135-fdddaa658234-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.283724 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hggqz\" (UniqueName: \"kubernetes.io/projected/b142d96b-87c3-444b-b135-fdddaa658234-kube-api-access-hggqz\") on node \"crc\" DevicePath \"\"" Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.283734 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zc6ds\" (UniqueName: \"kubernetes.io/projected/7a78c7d6-6ec6-4857-af87-25c5c8cf961d-kube-api-access-zc6ds\") on node \"crc\" DevicePath \"\"" Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 
00:11:51.324157 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2da73289-3f96-4828-a106-46c3b0469e7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2da73289-3f96-4828-a106-46c3b0469e7d" (UID: "2da73289-3f96-4828-a106-46c3b0469e7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.334972 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dmjz7"] Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.338256 4824 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dmjz7"] Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.347483 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a78c7d6-6ec6-4857-af87-25c5c8cf961d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7a78c7d6-6ec6-4857-af87-25c5c8cf961d" (UID: "7a78c7d6-6ec6-4857-af87-25c5c8cf961d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.384463 4824 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a78c7d6-6ec6-4857-af87-25c5c8cf961d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 00:11:51 crc kubenswrapper[4824]: I0224 00:11:51.384838 4824 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2da73289-3f96-4828-a106-46c3b0469e7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.001503 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hhftg" event={"ID":"3e306ddf-071d-47f2-b9b1-bf772963438e","Type":"ContainerDied","Data":"24c650baf1648fdbc140def26b06acbc896c72aa2095332a4a2cc286bdf3cc0c"} Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.001622 4824 scope.go:117] "RemoveContainer" containerID="83d0a00bbb287f8c717cb0e93e56c8a769b62fbe8a1114585fcf0819cddb1d85" Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.001550 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hhftg" Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.003982 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gl27t" event={"ID":"2da73289-3f96-4828-a106-46c3b0469e7d","Type":"ContainerDied","Data":"2503a134b22274bc6e70e9fb4c998a82c8a291a8ce5041c5a448cbf0b7c362a7"} Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.003994 4824 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gl27t" Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.007052 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zxplg" event={"ID":"7a78c7d6-6ec6-4857-af87-25c5c8cf961d","Type":"ContainerDied","Data":"6ef8798cf5f3aadb98a5ae1d2d3bf34bb35cf168ac8076ee6ba9bc741a06b98b"} Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.007124 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zxplg" Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.010179 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nzqwf" event={"ID":"b142d96b-87c3-444b-b135-fdddaa658234","Type":"ContainerDied","Data":"6087d7cb108c4772f5476645e00887a465effb8e262d89a746313bbbb9fb34f8"} Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.010254 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nzqwf" Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.013495 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-99tkw" event={"ID":"e312a49f-dc7a-49fc-9baf-3105fec587ae","Type":"ContainerDied","Data":"1e7b695fbb51788dd119d9e0ae76024be2038ddb563a00cf87c9d5c4544df61f"} Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.013596 4824 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-99tkw" Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.017242 4824 scope.go:117] "RemoveContainer" containerID="4336adaefce1f631229f06eda9fede5b34bd7e94028955471812962455639142" Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.045888 4824 scope.go:117] "RemoveContainer" containerID="165f557a643df29a5f3055b0f6055d2350a6f07b3c59175faba79784672bcb83" Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.053406 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gl27t"] Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.057842 4824 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gl27t"] Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.073549 4824 scope.go:117] "RemoveContainer" containerID="f3129bb41cd26ff02fd1b16661272cba00c8572d524dd0295795e4e681de10f0" Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.074738 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zxplg"] Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.078487 4824 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zxplg"] Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.085732 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hhftg"] Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.092300 4824 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hhftg"] Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.096790 4824 scope.go:117] "RemoveContainer" containerID="09e81517976ec38b505938bb2df2f3b6123c4b30e798621cb83825dcef2c35b1" Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.102976 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-nzqwf"]
Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.106611 4824 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nzqwf"]
Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.112130 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-99tkw"]
Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.115348 4824 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-99tkw"]
Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.128593 4824 scope.go:117] "RemoveContainer" containerID="92570d872625fe189d1225ae3cfcceb0efc1931cef5c4ee603139bb405c9eff3"
Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.143322 4824 scope.go:117] "RemoveContainer" containerID="da0889e558f40879c1fabc025b056771135a794508cda04386b8dc975f871518"
Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.157374 4824 scope.go:117] "RemoveContainer" containerID="ae3090316a207f659563cb6daa67a1cc4d3c280950cb420d2cf6d0ddebc465d5"
Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.172872 4824 scope.go:117] "RemoveContainer" containerID="05173dd075227354e5c8172cf583a8c34fd894215338d07f6c1a9644348f85b0"
Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.183849 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.185756 4824 scope.go:117] "RemoveContainer" containerID="d8cb34947ec733a964cff732c9bb70c2d8c98ea3a605270b5ec9f8c81b631a37"
Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.201752 4824 scope.go:117] "RemoveContainer" containerID="8bd5382363dfe954b11d2958183ea67ba5ab63752a6364c784c6c9e09c7286e0"
Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.218878 4824 scope.go:117] "RemoveContainer" containerID="5af3da4115b49b00d3bb13283e7fccd617f9a8fbd1e5c6782e319a1b0a15e513"
Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.241896 4824 scope.go:117] "RemoveContainer" containerID="1f7f84523e39d2e74db2895c5b1819295512a987f6083e74a45c4c25f78e706d"
Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.369668 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.615026 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.701408 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2da73289-3f96-4828-a106-46c3b0469e7d" path="/var/lib/kubelet/pods/2da73289-3f96-4828-a106-46c3b0469e7d/volumes"
Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.702118 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e306ddf-071d-47f2-b9b1-bf772963438e" path="/var/lib/kubelet/pods/3e306ddf-071d-47f2-b9b1-bf772963438e/volumes"
Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.702961 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a78c7d6-6ec6-4857-af87-25c5c8cf961d" path="/var/lib/kubelet/pods/7a78c7d6-6ec6-4857-af87-25c5c8cf961d/volumes"
Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.704264 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b142d96b-87c3-444b-b135-fdddaa658234" path="/var/lib/kubelet/pods/b142d96b-87c3-444b-b135-fdddaa658234/volumes"
Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.704991 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc119514-5c95-4925-8a1a-3e6844a34e1e" path="/var/lib/kubelet/pods/cc119514-5c95-4925-8a1a-3e6844a34e1e/volumes"
Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.706252 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e312a49f-dc7a-49fc-9baf-3105fec587ae" path="/var/lib/kubelet/pods/e312a49f-dc7a-49fc-9baf-3105fec587ae/volumes"
Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.793141 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.793237 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.903990 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.904098 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.904256 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.904286 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.904327 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.904374 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.904503 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.904589 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.904650 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.904897 4824 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\""
Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.904914 4824 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\""
Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.904925 4824 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\""
Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.904937 4824 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\""
Feb 24 00:11:52 crc kubenswrapper[4824]: I0224 00:11:52.914428 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 24 00:11:53 crc kubenswrapper[4824]: I0224 00:11:53.006277 4824 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\""
Feb 24 00:11:53 crc kubenswrapper[4824]: I0224 00:11:53.028927 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Feb 24 00:11:53 crc kubenswrapper[4824]: I0224 00:11:53.028991 4824 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="1491c4b381f60602c7a24a8e5a735a39ea625073883a8aa7fd3568c56da3a7e7" exitCode=137
Feb 24 00:11:53 crc kubenswrapper[4824]: I0224 00:11:53.029100 4824 scope.go:117] "RemoveContainer" containerID="1491c4b381f60602c7a24a8e5a735a39ea625073883a8aa7fd3568c56da3a7e7"
Feb 24 00:11:53 crc kubenswrapper[4824]: I0224 00:11:53.029348 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 24 00:11:53 crc kubenswrapper[4824]: I0224 00:11:53.049882 4824 scope.go:117] "RemoveContainer" containerID="1491c4b381f60602c7a24a8e5a735a39ea625073883a8aa7fd3568c56da3a7e7"
Feb 24 00:11:53 crc kubenswrapper[4824]: E0224 00:11:53.050805 4824 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1491c4b381f60602c7a24a8e5a735a39ea625073883a8aa7fd3568c56da3a7e7\": container with ID starting with 1491c4b381f60602c7a24a8e5a735a39ea625073883a8aa7fd3568c56da3a7e7 not found: ID does not exist" containerID="1491c4b381f60602c7a24a8e5a735a39ea625073883a8aa7fd3568c56da3a7e7"
Feb 24 00:11:53 crc kubenswrapper[4824]: I0224 00:11:53.050873 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1491c4b381f60602c7a24a8e5a735a39ea625073883a8aa7fd3568c56da3a7e7"} err="failed to get container status \"1491c4b381f60602c7a24a8e5a735a39ea625073883a8aa7fd3568c56da3a7e7\": rpc error: code = NotFound desc = could not find container \"1491c4b381f60602c7a24a8e5a735a39ea625073883a8aa7fd3568c56da3a7e7\": container with ID starting with 1491c4b381f60602c7a24a8e5a735a39ea625073883a8aa7fd3568c56da3a7e7 not found: ID does not exist"
Feb 24 00:11:54 crc kubenswrapper[4824]: I0224 00:11:54.700879 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes"
Feb 24 00:12:16 crc kubenswrapper[4824]: I0224 00:12:16.193912 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9t5cw"]
Feb 24 00:12:16 crc kubenswrapper[4824]: E0224 00:12:16.194731 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2da73289-3f96-4828-a106-46c3b0469e7d" containerName="registry-server"
Feb 24 00:12:16 crc kubenswrapper[4824]: I0224 00:12:16.194747 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="2da73289-3f96-4828-a106-46c3b0469e7d" containerName="registry-server"
Feb 24 00:12:16 crc kubenswrapper[4824]: E0224 00:12:16.194759 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a78c7d6-6ec6-4857-af87-25c5c8cf961d" containerName="extract-content"
Feb 24 00:12:16 crc kubenswrapper[4824]: I0224 00:12:16.194768 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a78c7d6-6ec6-4857-af87-25c5c8cf961d" containerName="extract-content"
Feb 24 00:12:16 crc kubenswrapper[4824]: E0224 00:12:16.194780 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="466928f3-88e1-4111-8358-13db2bd5ba58" containerName="installer"
Feb 24 00:12:16 crc kubenswrapper[4824]: I0224 00:12:16.194790 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="466928f3-88e1-4111-8358-13db2bd5ba58" containerName="installer"
Feb 24 00:12:16 crc kubenswrapper[4824]: E0224 00:12:16.194800 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a78c7d6-6ec6-4857-af87-25c5c8cf961d" containerName="extract-utilities"
Feb 24 00:12:16 crc kubenswrapper[4824]: I0224 00:12:16.194809 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a78c7d6-6ec6-4857-af87-25c5c8cf961d" containerName="extract-utilities"
Feb 24 00:12:16 crc kubenswrapper[4824]: E0224 00:12:16.194819 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2da73289-3f96-4828-a106-46c3b0469e7d" containerName="extract-content"
Feb 24 00:12:16 crc kubenswrapper[4824]: I0224 00:12:16.194827 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="2da73289-3f96-4828-a106-46c3b0469e7d" containerName="extract-content"
Feb 24 00:12:16 crc kubenswrapper[4824]: E0224 00:12:16.194837 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e306ddf-071d-47f2-b9b1-bf772963438e" containerName="registry-server"
Feb 24 00:12:16 crc kubenswrapper[4824]: I0224 00:12:16.194845 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e306ddf-071d-47f2-b9b1-bf772963438e" containerName="registry-server"
Feb 24 00:12:16 crc kubenswrapper[4824]: E0224 00:12:16.194857 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b142d96b-87c3-444b-b135-fdddaa658234" containerName="registry-server"
Feb 24 00:12:16 crc kubenswrapper[4824]: I0224 00:12:16.194865 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="b142d96b-87c3-444b-b135-fdddaa658234" containerName="registry-server"
Feb 24 00:12:16 crc kubenswrapper[4824]: E0224 00:12:16.194876 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a78c7d6-6ec6-4857-af87-25c5c8cf961d" containerName="registry-server"
Feb 24 00:12:16 crc kubenswrapper[4824]: I0224 00:12:16.194884 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a78c7d6-6ec6-4857-af87-25c5c8cf961d" containerName="registry-server"
Feb 24 00:12:16 crc kubenswrapper[4824]: E0224 00:12:16.194898 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Feb 24 00:12:16 crc kubenswrapper[4824]: I0224 00:12:16.194906 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Feb 24 00:12:16 crc kubenswrapper[4824]: E0224 00:12:16.194919 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b142d96b-87c3-444b-b135-fdddaa658234" containerName="extract-utilities"
Feb 24 00:12:16 crc kubenswrapper[4824]: I0224 00:12:16.194928 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="b142d96b-87c3-444b-b135-fdddaa658234" containerName="extract-utilities"
Feb 24 00:12:16 crc kubenswrapper[4824]: E0224 00:12:16.194940 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc119514-5c95-4925-8a1a-3e6844a34e1e" containerName="extract-content"
Feb 24 00:12:16 crc kubenswrapper[4824]: I0224 00:12:16.194948 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc119514-5c95-4925-8a1a-3e6844a34e1e" containerName="extract-content"
Feb 24 00:12:16 crc kubenswrapper[4824]: E0224 00:12:16.194962 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e306ddf-071d-47f2-b9b1-bf772963438e" containerName="extract-utilities"
Feb 24 00:12:16 crc kubenswrapper[4824]: I0224 00:12:16.194970 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e306ddf-071d-47f2-b9b1-bf772963438e" containerName="extract-utilities"
Feb 24 00:12:16 crc kubenswrapper[4824]: E0224 00:12:16.194981 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc119514-5c95-4925-8a1a-3e6844a34e1e" containerName="extract-utilities"
Feb 24 00:12:16 crc kubenswrapper[4824]: I0224 00:12:16.194991 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc119514-5c95-4925-8a1a-3e6844a34e1e" containerName="extract-utilities"
Feb 24 00:12:16 crc kubenswrapper[4824]: E0224 00:12:16.195001 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e306ddf-071d-47f2-b9b1-bf772963438e" containerName="extract-content"
Feb 24 00:12:16 crc kubenswrapper[4824]: I0224 00:12:16.195009 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e306ddf-071d-47f2-b9b1-bf772963438e" containerName="extract-content"
Feb 24 00:12:16 crc kubenswrapper[4824]: E0224 00:12:16.195019 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc119514-5c95-4925-8a1a-3e6844a34e1e" containerName="registry-server"
Feb 24 00:12:16 crc kubenswrapper[4824]: I0224 00:12:16.195027 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc119514-5c95-4925-8a1a-3e6844a34e1e" containerName="registry-server"
Feb 24 00:12:16 crc kubenswrapper[4824]: E0224 00:12:16.195039 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e312a49f-dc7a-49fc-9baf-3105fec587ae" containerName="marketplace-operator"
Feb 24 00:12:16 crc kubenswrapper[4824]: I0224 00:12:16.195047 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="e312a49f-dc7a-49fc-9baf-3105fec587ae" containerName="marketplace-operator"
Feb 24 00:12:16 crc kubenswrapper[4824]: E0224 00:12:16.195063 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2da73289-3f96-4828-a106-46c3b0469e7d" containerName="extract-utilities"
Feb 24 00:12:16 crc kubenswrapper[4824]: I0224 00:12:16.195071 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="2da73289-3f96-4828-a106-46c3b0469e7d" containerName="extract-utilities"
Feb 24 00:12:16 crc kubenswrapper[4824]: E0224 00:12:16.195083 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b142d96b-87c3-444b-b135-fdddaa658234" containerName="extract-content"
Feb 24 00:12:16 crc kubenswrapper[4824]: I0224 00:12:16.195092 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="b142d96b-87c3-444b-b135-fdddaa658234" containerName="extract-content"
Feb 24 00:12:16 crc kubenswrapper[4824]: I0224 00:12:16.195230 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="2da73289-3f96-4828-a106-46c3b0469e7d" containerName="registry-server"
Feb 24 00:12:16 crc kubenswrapper[4824]: I0224 00:12:16.195244 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="e312a49f-dc7a-49fc-9baf-3105fec587ae" containerName="marketplace-operator"
Feb 24 00:12:16 crc kubenswrapper[4824]: I0224 00:12:16.195256 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e306ddf-071d-47f2-b9b1-bf772963438e" containerName="registry-server"
Feb 24 00:12:16 crc kubenswrapper[4824]: I0224 00:12:16.195269 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc119514-5c95-4925-8a1a-3e6844a34e1e" containerName="registry-server"
Feb 24 00:12:16 crc kubenswrapper[4824]: I0224 00:12:16.195281 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="b142d96b-87c3-444b-b135-fdddaa658234" containerName="registry-server"
Feb 24 00:12:16 crc kubenswrapper[4824]: I0224 00:12:16.195296 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Feb 24 00:12:16 crc kubenswrapper[4824]: I0224 00:12:16.195310 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="466928f3-88e1-4111-8358-13db2bd5ba58" containerName="installer"
Feb 24 00:12:16 crc kubenswrapper[4824]: I0224 00:12:16.195323 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a78c7d6-6ec6-4857-af87-25c5c8cf961d" containerName="registry-server"
Feb 24 00:12:16 crc kubenswrapper[4824]: I0224 00:12:16.197217 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9t5cw"
Feb 24 00:12:16 crc kubenswrapper[4824]: I0224 00:12:16.208852 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Feb 24 00:12:16 crc kubenswrapper[4824]: I0224 00:12:16.209038 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Feb 24 00:12:16 crc kubenswrapper[4824]: I0224 00:12:16.209169 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 24 00:12:16 crc kubenswrapper[4824]: I0224 00:12:16.221019 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9da3bd34-bc43-4c9d-a974-a131ad945913-utilities\") pod \"community-operators-9t5cw\" (UID: \"9da3bd34-bc43-4c9d-a974-a131ad945913\") " pod="openshift-marketplace/community-operators-9t5cw"
Feb 24 00:12:16 crc kubenswrapper[4824]: I0224 00:12:16.221115 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrvss\" (UniqueName: \"kubernetes.io/projected/9da3bd34-bc43-4c9d-a974-a131ad945913-kube-api-access-jrvss\") pod \"community-operators-9t5cw\" (UID: \"9da3bd34-bc43-4c9d-a974-a131ad945913\") " pod="openshift-marketplace/community-operators-9t5cw"
Feb 24 00:12:16 crc kubenswrapper[4824]: I0224 00:12:16.221164 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9da3bd34-bc43-4c9d-a974-a131ad945913-catalog-content\") pod \"community-operators-9t5cw\" (UID: \"9da3bd34-bc43-4c9d-a974-a131ad945913\") " pod="openshift-marketplace/community-operators-9t5cw"
Feb 24 00:12:16 crc kubenswrapper[4824]: I0224 00:12:16.222436 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9t5cw"]
Feb 24 00:12:16 crc kubenswrapper[4824]: I0224 00:12:16.321822 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9da3bd34-bc43-4c9d-a974-a131ad945913-catalog-content\") pod \"community-operators-9t5cw\" (UID: \"9da3bd34-bc43-4c9d-a974-a131ad945913\") " pod="openshift-marketplace/community-operators-9t5cw"
Feb 24 00:12:16 crc kubenswrapper[4824]: I0224 00:12:16.321896 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9da3bd34-bc43-4c9d-a974-a131ad945913-utilities\") pod \"community-operators-9t5cw\" (UID: \"9da3bd34-bc43-4c9d-a974-a131ad945913\") " pod="openshift-marketplace/community-operators-9t5cw"
Feb 24 00:12:16 crc kubenswrapper[4824]: I0224 00:12:16.322439 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrvss\" (UniqueName: \"kubernetes.io/projected/9da3bd34-bc43-4c9d-a974-a131ad945913-kube-api-access-jrvss\") pod \"community-operators-9t5cw\" (UID: \"9da3bd34-bc43-4c9d-a974-a131ad945913\") " pod="openshift-marketplace/community-operators-9t5cw"
Feb 24 00:12:16 crc kubenswrapper[4824]: I0224 00:12:16.322539 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9da3bd34-bc43-4c9d-a974-a131ad945913-utilities\") pod \"community-operators-9t5cw\" (UID: \"9da3bd34-bc43-4c9d-a974-a131ad945913\") " pod="openshift-marketplace/community-operators-9t5cw"
Feb 24 00:12:16 crc kubenswrapper[4824]: I0224 00:12:16.322894 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9da3bd34-bc43-4c9d-a974-a131ad945913-catalog-content\") pod \"community-operators-9t5cw\" (UID: \"9da3bd34-bc43-4c9d-a974-a131ad945913\") " pod="openshift-marketplace/community-operators-9t5cw"
Feb 24 00:12:16 crc kubenswrapper[4824]: I0224 00:12:16.342567 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrvss\" (UniqueName: \"kubernetes.io/projected/9da3bd34-bc43-4c9d-a974-a131ad945913-kube-api-access-jrvss\") pod \"community-operators-9t5cw\" (UID: \"9da3bd34-bc43-4c9d-a974-a131ad945913\") " pod="openshift-marketplace/community-operators-9t5cw"
Feb 24 00:12:16 crc kubenswrapper[4824]: I0224 00:12:16.534363 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9t5cw"
Feb 24 00:12:16 crc kubenswrapper[4824]: I0224 00:12:16.970466 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9t5cw"]
Feb 24 00:12:17 crc kubenswrapper[4824]: I0224 00:12:17.164303 4824 generic.go:334] "Generic (PLEG): container finished" podID="9da3bd34-bc43-4c9d-a974-a131ad945913" containerID="2ed26ef799edef0ffd0e95ed37ad0f017318b13d3836a24eab989c21d4c788fa" exitCode=0
Feb 24 00:12:17 crc kubenswrapper[4824]: I0224 00:12:17.164396 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9t5cw" event={"ID":"9da3bd34-bc43-4c9d-a974-a131ad945913","Type":"ContainerDied","Data":"2ed26ef799edef0ffd0e95ed37ad0f017318b13d3836a24eab989c21d4c788fa"}
Feb 24 00:12:17 crc kubenswrapper[4824]: I0224 00:12:17.164457 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9t5cw" event={"ID":"9da3bd34-bc43-4c9d-a974-a131ad945913","Type":"ContainerStarted","Data":"8f3d32031602416c5c08e80ddfc202c716f95da638d1dfebd16062c3d0142dcc"}
Feb 24 00:12:17 crc kubenswrapper[4824]: I0224 00:12:17.583650 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9gq54"]
Feb 24 00:12:17 crc kubenswrapper[4824]: I0224 00:12:17.584619 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9gq54"
Feb 24 00:12:17 crc kubenswrapper[4824]: I0224 00:12:17.587264 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Feb 24 00:12:17 crc kubenswrapper[4824]: I0224 00:12:17.601840 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9gq54"]
Feb 24 00:12:17 crc kubenswrapper[4824]: I0224 00:12:17.640874 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxh7r\" (UniqueName: \"kubernetes.io/projected/2d7ceac8-1cca-49dc-bff6-f6fa38cbfc1d-kube-api-access-xxh7r\") pod \"certified-operators-9gq54\" (UID: \"2d7ceac8-1cca-49dc-bff6-f6fa38cbfc1d\") " pod="openshift-marketplace/certified-operators-9gq54"
Feb 24 00:12:17 crc kubenswrapper[4824]: I0224 00:12:17.640926 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d7ceac8-1cca-49dc-bff6-f6fa38cbfc1d-catalog-content\") pod \"certified-operators-9gq54\" (UID: \"2d7ceac8-1cca-49dc-bff6-f6fa38cbfc1d\") " pod="openshift-marketplace/certified-operators-9gq54"
Feb 24 00:12:17 crc kubenswrapper[4824]: I0224 00:12:17.641087 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d7ceac8-1cca-49dc-bff6-f6fa38cbfc1d-utilities\") pod \"certified-operators-9gq54\" (UID: \"2d7ceac8-1cca-49dc-bff6-f6fa38cbfc1d\") " pod="openshift-marketplace/certified-operators-9gq54"
Feb 24 00:12:17 crc kubenswrapper[4824]: I0224 00:12:17.742237 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d7ceac8-1cca-49dc-bff6-f6fa38cbfc1d-utilities\") pod \"certified-operators-9gq54\" (UID: \"2d7ceac8-1cca-49dc-bff6-f6fa38cbfc1d\") " pod="openshift-marketplace/certified-operators-9gq54"
Feb 24 00:12:17 crc kubenswrapper[4824]: I0224 00:12:17.742316 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxh7r\" (UniqueName: \"kubernetes.io/projected/2d7ceac8-1cca-49dc-bff6-f6fa38cbfc1d-kube-api-access-xxh7r\") pod \"certified-operators-9gq54\" (UID: \"2d7ceac8-1cca-49dc-bff6-f6fa38cbfc1d\") " pod="openshift-marketplace/certified-operators-9gq54"
Feb 24 00:12:17 crc kubenswrapper[4824]: I0224 00:12:17.742380 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d7ceac8-1cca-49dc-bff6-f6fa38cbfc1d-catalog-content\") pod \"certified-operators-9gq54\" (UID: \"2d7ceac8-1cca-49dc-bff6-f6fa38cbfc1d\") " pod="openshift-marketplace/certified-operators-9gq54"
Feb 24 00:12:17 crc kubenswrapper[4824]: I0224 00:12:17.742886 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d7ceac8-1cca-49dc-bff6-f6fa38cbfc1d-utilities\") pod \"certified-operators-9gq54\" (UID: \"2d7ceac8-1cca-49dc-bff6-f6fa38cbfc1d\") " pod="openshift-marketplace/certified-operators-9gq54"
Feb 24 00:12:17 crc kubenswrapper[4824]: I0224 00:12:17.742966 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d7ceac8-1cca-49dc-bff6-f6fa38cbfc1d-catalog-content\") pod \"certified-operators-9gq54\" (UID: \"2d7ceac8-1cca-49dc-bff6-f6fa38cbfc1d\") " pod="openshift-marketplace/certified-operators-9gq54"
Feb 24 00:12:17 crc kubenswrapper[4824]: I0224 00:12:17.767816 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxh7r\" (UniqueName: \"kubernetes.io/projected/2d7ceac8-1cca-49dc-bff6-f6fa38cbfc1d-kube-api-access-xxh7r\") pod \"certified-operators-9gq54\" (UID: \"2d7ceac8-1cca-49dc-bff6-f6fa38cbfc1d\") " pod="openshift-marketplace/certified-operators-9gq54"
Feb 24 00:12:17 crc kubenswrapper[4824]: I0224 00:12:17.900427 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9gq54"
Feb 24 00:12:18 crc kubenswrapper[4824]: I0224 00:12:18.153688 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9gq54"]
Feb 24 00:12:18 crc kubenswrapper[4824]: W0224 00:12:18.158193 4824 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d7ceac8_1cca_49dc_bff6_f6fa38cbfc1d.slice/crio-5a7b263c271ecf10633b1c8eca972c691e190e8eeb3103a58ee1330d194d7bd0 WatchSource:0}: Error finding container 5a7b263c271ecf10633b1c8eca972c691e190e8eeb3103a58ee1330d194d7bd0: Status 404 returned error can't find the container with id 5a7b263c271ecf10633b1c8eca972c691e190e8eeb3103a58ee1330d194d7bd0
Feb 24 00:12:18 crc kubenswrapper[4824]: I0224 00:12:18.172917 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log"
Feb 24 00:12:18 crc kubenswrapper[4824]: I0224 00:12:18.174386 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log"
Feb 24 00:12:18 crc kubenswrapper[4824]: I0224 00:12:18.176058 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Feb 24 00:12:18 crc kubenswrapper[4824]: I0224 00:12:18.176134 4824 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="34eb8fb83da5aca983d6da868242cce539ecbefeda8efd3d70063bb191fa81ec" exitCode=137
Feb 24 00:12:18 crc kubenswrapper[4824]: I0224 00:12:18.176221 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"34eb8fb83da5aca983d6da868242cce539ecbefeda8efd3d70063bb191fa81ec"}
Feb 24 00:12:18 crc kubenswrapper[4824]: I0224 00:12:18.176301 4824 scope.go:117] "RemoveContainer" containerID="45746da193a3aaa64e64067e2324afb4e4fb1a90a1357461ed97c0bbb5109f5b"
Feb 24 00:12:18 crc kubenswrapper[4824]: I0224 00:12:18.178737 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9gq54" event={"ID":"2d7ceac8-1cca-49dc-bff6-f6fa38cbfc1d","Type":"ContainerStarted","Data":"5a7b263c271ecf10633b1c8eca972c691e190e8eeb3103a58ee1330d194d7bd0"}
Feb 24 00:12:18 crc kubenswrapper[4824]: I0224 00:12:18.183407 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9t5cw" event={"ID":"9da3bd34-bc43-4c9d-a974-a131ad945913","Type":"ContainerStarted","Data":"16fc88db0b88666a21a051566414e9cd9655444221b748206f54ed15178554e6"}
Feb 24 00:12:18 crc kubenswrapper[4824]: I0224 00:12:18.584948 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-49mft"]
Feb 24 00:12:18 crc kubenswrapper[4824]: I0224 00:12:18.585900 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-49mft"
Feb 24 00:12:18 crc kubenswrapper[4824]: I0224 00:12:18.587597 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Feb 24 00:12:18 crc kubenswrapper[4824]: I0224 00:12:18.595965 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-49mft"]
Feb 24 00:12:18 crc kubenswrapper[4824]: I0224 00:12:18.759778 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hxsv\" (UniqueName: \"kubernetes.io/projected/a392c527-174d-4f66-a7cd-5f625192f3c7-kube-api-access-5hxsv\") pod \"redhat-marketplace-49mft\" (UID: \"a392c527-174d-4f66-a7cd-5f625192f3c7\") " pod="openshift-marketplace/redhat-marketplace-49mft"
Feb 24 00:12:18 crc kubenswrapper[4824]: I0224 00:12:18.759912 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a392c527-174d-4f66-a7cd-5f625192f3c7-utilities\") pod \"redhat-marketplace-49mft\" (UID: \"a392c527-174d-4f66-a7cd-5f625192f3c7\") " pod="openshift-marketplace/redhat-marketplace-49mft"
Feb 24 00:12:18 crc kubenswrapper[4824]: I0224 00:12:18.759935 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a392c527-174d-4f66-a7cd-5f625192f3c7-catalog-content\") pod \"redhat-marketplace-49mft\" (UID: \"a392c527-174d-4f66-a7cd-5f625192f3c7\") " pod="openshift-marketplace/redhat-marketplace-49mft"
Feb 24 00:12:18 crc kubenswrapper[4824]: I0224 00:12:18.861608 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a392c527-174d-4f66-a7cd-5f625192f3c7-utilities\") pod \"redhat-marketplace-49mft\" (UID: \"a392c527-174d-4f66-a7cd-5f625192f3c7\") " pod="openshift-marketplace/redhat-marketplace-49mft"
Feb 24 00:12:18 crc kubenswrapper[4824]: I0224 00:12:18.861674 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a392c527-174d-4f66-a7cd-5f625192f3c7-catalog-content\") pod \"redhat-marketplace-49mft\" (UID: \"a392c527-174d-4f66-a7cd-5f625192f3c7\") " pod="openshift-marketplace/redhat-marketplace-49mft"
Feb 24 00:12:18 crc kubenswrapper[4824]: I0224 00:12:18.861798 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hxsv\" (UniqueName: \"kubernetes.io/projected/a392c527-174d-4f66-a7cd-5f625192f3c7-kube-api-access-5hxsv\") pod \"redhat-marketplace-49mft\" (UID: \"a392c527-174d-4f66-a7cd-5f625192f3c7\") " pod="openshift-marketplace/redhat-marketplace-49mft"
Feb 24 00:12:18 crc kubenswrapper[4824]: I0224 00:12:18.862636 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a392c527-174d-4f66-a7cd-5f625192f3c7-utilities\") pod \"redhat-marketplace-49mft\" (UID: \"a392c527-174d-4f66-a7cd-5f625192f3c7\") " pod="openshift-marketplace/redhat-marketplace-49mft"
Feb 24 00:12:18 crc kubenswrapper[4824]: I0224 00:12:18.862866 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a392c527-174d-4f66-a7cd-5f625192f3c7-catalog-content\") pod \"redhat-marketplace-49mft\" (UID: \"a392c527-174d-4f66-a7cd-5f625192f3c7\") " pod="openshift-marketplace/redhat-marketplace-49mft"
Feb 24 00:12:18 crc kubenswrapper[4824]: I0224 00:12:18.886033 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hxsv\" (UniqueName: \"kubernetes.io/projected/a392c527-174d-4f66-a7cd-5f625192f3c7-kube-api-access-5hxsv\") pod \"redhat-marketplace-49mft\" (UID: \"a392c527-174d-4f66-a7cd-5f625192f3c7\") " pod="openshift-marketplace/redhat-marketplace-49mft"
Feb 24 00:12:18 crc kubenswrapper[4824]: I0224 00:12:18.905838 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-49mft"
Feb 24 00:12:19 crc kubenswrapper[4824]: I0224 00:12:19.191814 4824 generic.go:334] "Generic (PLEG): container finished" podID="2d7ceac8-1cca-49dc-bff6-f6fa38cbfc1d" containerID="7d1531cf7ef51dd43c129802fe3e10d3c81e1c31d5c3eee8dca4bb27d6f84300" exitCode=0
Feb 24 00:12:19 crc kubenswrapper[4824]: I0224 00:12:19.191916 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9gq54" event={"ID":"2d7ceac8-1cca-49dc-bff6-f6fa38cbfc1d","Type":"ContainerDied","Data":"7d1531cf7ef51dd43c129802fe3e10d3c81e1c31d5c3eee8dca4bb27d6f84300"}
Feb 24 00:12:19 crc kubenswrapper[4824]: I0224 00:12:19.197445 4824 generic.go:334] "Generic (PLEG): container finished" podID="9da3bd34-bc43-4c9d-a974-a131ad945913" containerID="16fc88db0b88666a21a051566414e9cd9655444221b748206f54ed15178554e6" exitCode=0
Feb 24 00:12:19 crc kubenswrapper[4824]: I0224 00:12:19.197557 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9t5cw" event={"ID":"9da3bd34-bc43-4c9d-a974-a131ad945913","Type":"ContainerDied","Data":"16fc88db0b88666a21a051566414e9cd9655444221b748206f54ed15178554e6"}
Feb 24 00:12:19 crc kubenswrapper[4824]: I0224 00:12:19.204122 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log"
Feb 24 00:12:19 crc kubenswrapper[4824]: I0224 00:12:19.205935 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log"
Feb 24 00:12:19 crc kubenswrapper[4824]: I0224
00:12:19.207009 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"cd788104ee5073679e89d6716044c8f9b1bc2b1b2f1e8430a5c80eac94b1bc14"} Feb 24 00:12:19 crc kubenswrapper[4824]: I0224 00:12:19.306809 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-49mft"] Feb 24 00:12:19 crc kubenswrapper[4824]: W0224 00:12:19.318229 4824 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda392c527_174d_4f66_a7cd_5f625192f3c7.slice/crio-c3b19531b333b7d03df785c5e3fd25d1eeba8ccb1da22f7e993d8082b132b9a9 WatchSource:0}: Error finding container c3b19531b333b7d03df785c5e3fd25d1eeba8ccb1da22f7e993d8082b132b9a9: Status 404 returned error can't find the container with id c3b19531b333b7d03df785c5e3fd25d1eeba8ccb1da22f7e993d8082b132b9a9 Feb 24 00:12:19 crc kubenswrapper[4824]: I0224 00:12:19.988148 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2gh9t"] Feb 24 00:12:19 crc kubenswrapper[4824]: I0224 00:12:19.990280 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2gh9t" Feb 24 00:12:19 crc kubenswrapper[4824]: I0224 00:12:19.996957 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 24 00:12:20 crc kubenswrapper[4824]: I0224 00:12:20.003007 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2gh9t"] Feb 24 00:12:20 crc kubenswrapper[4824]: I0224 00:12:20.179678 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4tlq\" (UniqueName: \"kubernetes.io/projected/ee751741-65c5-4db2-aa84-8c1e6868cf86-kube-api-access-s4tlq\") pod \"redhat-operators-2gh9t\" (UID: \"ee751741-65c5-4db2-aa84-8c1e6868cf86\") " pod="openshift-marketplace/redhat-operators-2gh9t" Feb 24 00:12:20 crc kubenswrapper[4824]: I0224 00:12:20.179760 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee751741-65c5-4db2-aa84-8c1e6868cf86-utilities\") pod \"redhat-operators-2gh9t\" (UID: \"ee751741-65c5-4db2-aa84-8c1e6868cf86\") " pod="openshift-marketplace/redhat-operators-2gh9t" Feb 24 00:12:20 crc kubenswrapper[4824]: I0224 00:12:20.180124 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee751741-65c5-4db2-aa84-8c1e6868cf86-catalog-content\") pod \"redhat-operators-2gh9t\" (UID: \"ee751741-65c5-4db2-aa84-8c1e6868cf86\") " pod="openshift-marketplace/redhat-operators-2gh9t" Feb 24 00:12:20 crc kubenswrapper[4824]: I0224 00:12:20.215799 4824 generic.go:334] "Generic (PLEG): container finished" podID="a392c527-174d-4f66-a7cd-5f625192f3c7" containerID="cbad1694c39409f46b82fab5f1dbe7bae8d23677e9a40e2d515237465aad7438" exitCode=0 Feb 24 00:12:20 crc kubenswrapper[4824]: I0224 00:12:20.215880 4824 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-49mft" event={"ID":"a392c527-174d-4f66-a7cd-5f625192f3c7","Type":"ContainerDied","Data":"cbad1694c39409f46b82fab5f1dbe7bae8d23677e9a40e2d515237465aad7438"} Feb 24 00:12:20 crc kubenswrapper[4824]: I0224 00:12:20.215913 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-49mft" event={"ID":"a392c527-174d-4f66-a7cd-5f625192f3c7","Type":"ContainerStarted","Data":"c3b19531b333b7d03df785c5e3fd25d1eeba8ccb1da22f7e993d8082b132b9a9"} Feb 24 00:12:20 crc kubenswrapper[4824]: I0224 00:12:20.220808 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9gq54" event={"ID":"2d7ceac8-1cca-49dc-bff6-f6fa38cbfc1d","Type":"ContainerStarted","Data":"436fe7ae7f38dd1b2015c03bc4e174f6d2505305fb8bdb910e42d905aca7ff72"} Feb 24 00:12:20 crc kubenswrapper[4824]: I0224 00:12:20.226163 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9t5cw" event={"ID":"9da3bd34-bc43-4c9d-a974-a131ad945913","Type":"ContainerStarted","Data":"dd40460b2a99c7b75ae256225c8a47501b34e8656c2a46b2f3826d3108123382"} Feb 24 00:12:20 crc kubenswrapper[4824]: I0224 00:12:20.281600 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee751741-65c5-4db2-aa84-8c1e6868cf86-catalog-content\") pod \"redhat-operators-2gh9t\" (UID: \"ee751741-65c5-4db2-aa84-8c1e6868cf86\") " pod="openshift-marketplace/redhat-operators-2gh9t" Feb 24 00:12:20 crc kubenswrapper[4824]: I0224 00:12:20.281667 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4tlq\" (UniqueName: \"kubernetes.io/projected/ee751741-65c5-4db2-aa84-8c1e6868cf86-kube-api-access-s4tlq\") pod \"redhat-operators-2gh9t\" (UID: \"ee751741-65c5-4db2-aa84-8c1e6868cf86\") " 
pod="openshift-marketplace/redhat-operators-2gh9t" Feb 24 00:12:20 crc kubenswrapper[4824]: I0224 00:12:20.282363 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee751741-65c5-4db2-aa84-8c1e6868cf86-catalog-content\") pod \"redhat-operators-2gh9t\" (UID: \"ee751741-65c5-4db2-aa84-8c1e6868cf86\") " pod="openshift-marketplace/redhat-operators-2gh9t" Feb 24 00:12:20 crc kubenswrapper[4824]: I0224 00:12:20.282606 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee751741-65c5-4db2-aa84-8c1e6868cf86-utilities\") pod \"redhat-operators-2gh9t\" (UID: \"ee751741-65c5-4db2-aa84-8c1e6868cf86\") " pod="openshift-marketplace/redhat-operators-2gh9t" Feb 24 00:12:20 crc kubenswrapper[4824]: I0224 00:12:20.283551 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee751741-65c5-4db2-aa84-8c1e6868cf86-utilities\") pod \"redhat-operators-2gh9t\" (UID: \"ee751741-65c5-4db2-aa84-8c1e6868cf86\") " pod="openshift-marketplace/redhat-operators-2gh9t" Feb 24 00:12:20 crc kubenswrapper[4824]: I0224 00:12:20.305684 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4tlq\" (UniqueName: \"kubernetes.io/projected/ee751741-65c5-4db2-aa84-8c1e6868cf86-kube-api-access-s4tlq\") pod \"redhat-operators-2gh9t\" (UID: \"ee751741-65c5-4db2-aa84-8c1e6868cf86\") " pod="openshift-marketplace/redhat-operators-2gh9t" Feb 24 00:12:20 crc kubenswrapper[4824]: I0224 00:12:20.319970 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2gh9t" Feb 24 00:12:20 crc kubenswrapper[4824]: I0224 00:12:20.748971 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9t5cw" podStartSLOduration=2.22147007 podStartE2EDuration="4.748945513s" podCreationTimestamp="2026-02-24 00:12:16 +0000 UTC" firstStartedPulling="2026-02-24 00:12:17.166734143 +0000 UTC m=+401.156358612" lastFinishedPulling="2026-02-24 00:12:19.694209576 +0000 UTC m=+403.683834055" observedRunningTime="2026-02-24 00:12:20.284528746 +0000 UTC m=+404.274153215" watchObservedRunningTime="2026-02-24 00:12:20.748945513 +0000 UTC m=+404.738569992" Feb 24 00:12:20 crc kubenswrapper[4824]: I0224 00:12:20.754224 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2gh9t"] Feb 24 00:12:20 crc kubenswrapper[4824]: W0224 00:12:20.761954 4824 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee751741_65c5_4db2_aa84_8c1e6868cf86.slice/crio-02b0896a182546de17adf69ee63091996ac3b3d450e051ee0549762f4472ca94 WatchSource:0}: Error finding container 02b0896a182546de17adf69ee63091996ac3b3d450e051ee0549762f4472ca94: Status 404 returned error can't find the container with id 02b0896a182546de17adf69ee63091996ac3b3d450e051ee0549762f4472ca94 Feb 24 00:12:21 crc kubenswrapper[4824]: I0224 00:12:21.233982 4824 generic.go:334] "Generic (PLEG): container finished" podID="ee751741-65c5-4db2-aa84-8c1e6868cf86" containerID="aebf29b53c0744e4fc62470b2b490adcbfe0e860d41469fd83ba66726889c35d" exitCode=0 Feb 24 00:12:21 crc kubenswrapper[4824]: I0224 00:12:21.234176 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2gh9t" event={"ID":"ee751741-65c5-4db2-aa84-8c1e6868cf86","Type":"ContainerDied","Data":"aebf29b53c0744e4fc62470b2b490adcbfe0e860d41469fd83ba66726889c35d"} Feb 24 
00:12:21 crc kubenswrapper[4824]: I0224 00:12:21.234400 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2gh9t" event={"ID":"ee751741-65c5-4db2-aa84-8c1e6868cf86","Type":"ContainerStarted","Data":"02b0896a182546de17adf69ee63091996ac3b3d450e051ee0549762f4472ca94"} Feb 24 00:12:21 crc kubenswrapper[4824]: I0224 00:12:21.240956 4824 generic.go:334] "Generic (PLEG): container finished" podID="a392c527-174d-4f66-a7cd-5f625192f3c7" containerID="c010092ad92facedf5f70ec1775c1db569552e65318cda2aa08c7010e7ba052b" exitCode=0 Feb 24 00:12:21 crc kubenswrapper[4824]: I0224 00:12:21.241033 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-49mft" event={"ID":"a392c527-174d-4f66-a7cd-5f625192f3c7","Type":"ContainerDied","Data":"c010092ad92facedf5f70ec1775c1db569552e65318cda2aa08c7010e7ba052b"} Feb 24 00:12:21 crc kubenswrapper[4824]: I0224 00:12:21.247732 4824 generic.go:334] "Generic (PLEG): container finished" podID="2d7ceac8-1cca-49dc-bff6-f6fa38cbfc1d" containerID="436fe7ae7f38dd1b2015c03bc4e174f6d2505305fb8bdb910e42d905aca7ff72" exitCode=0 Feb 24 00:12:21 crc kubenswrapper[4824]: I0224 00:12:21.248413 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9gq54" event={"ID":"2d7ceac8-1cca-49dc-bff6-f6fa38cbfc1d","Type":"ContainerDied","Data":"436fe7ae7f38dd1b2015c03bc4e174f6d2505305fb8bdb910e42d905aca7ff72"} Feb 24 00:12:22 crc kubenswrapper[4824]: I0224 00:12:22.254956 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9gq54" event={"ID":"2d7ceac8-1cca-49dc-bff6-f6fa38cbfc1d","Type":"ContainerStarted","Data":"af14323064ed760129bdf058136730f8060a37bfcd1a17a32c767285d91d8ee3"} Feb 24 00:12:22 crc kubenswrapper[4824]: I0224 00:12:22.258261 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2gh9t" 
event={"ID":"ee751741-65c5-4db2-aa84-8c1e6868cf86","Type":"ContainerStarted","Data":"f9b584b25f5128765a5a2edfc6aa7e541ad6845c731eef3b059f02685564264b"} Feb 24 00:12:22 crc kubenswrapper[4824]: I0224 00:12:22.260531 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-49mft" event={"ID":"a392c527-174d-4f66-a7cd-5f625192f3c7","Type":"ContainerStarted","Data":"50e709672894167a7582bd8375871ba68b8ed732f6c8d86a62300ec1ec37ba43"} Feb 24 00:12:22 crc kubenswrapper[4824]: I0224 00:12:22.293305 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9gq54" podStartSLOduration=2.640629552 podStartE2EDuration="5.293285697s" podCreationTimestamp="2026-02-24 00:12:17 +0000 UTC" firstStartedPulling="2026-02-24 00:12:19.195320016 +0000 UTC m=+403.184944485" lastFinishedPulling="2026-02-24 00:12:21.847976161 +0000 UTC m=+405.837600630" observedRunningTime="2026-02-24 00:12:22.277931387 +0000 UTC m=+406.267555866" watchObservedRunningTime="2026-02-24 00:12:22.293285697 +0000 UTC m=+406.282910156" Feb 24 00:12:22 crc kubenswrapper[4824]: I0224 00:12:22.306461 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-49mft" podStartSLOduration=2.628951762 podStartE2EDuration="4.306445086s" podCreationTimestamp="2026-02-24 00:12:18 +0000 UTC" firstStartedPulling="2026-02-24 00:12:20.218662785 +0000 UTC m=+404.208287254" lastFinishedPulling="2026-02-24 00:12:21.896156109 +0000 UTC m=+405.885780578" observedRunningTime="2026-02-24 00:12:22.291628601 +0000 UTC m=+406.281253080" watchObservedRunningTime="2026-02-24 00:12:22.306445086 +0000 UTC m=+406.296069555" Feb 24 00:12:23 crc kubenswrapper[4824]: I0224 00:12:23.268720 4824 generic.go:334] "Generic (PLEG): container finished" podID="ee751741-65c5-4db2-aa84-8c1e6868cf86" containerID="f9b584b25f5128765a5a2edfc6aa7e541ad6845c731eef3b059f02685564264b" exitCode=0 Feb 24 
00:12:23 crc kubenswrapper[4824]: I0224 00:12:23.269655 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2gh9t" event={"ID":"ee751741-65c5-4db2-aa84-8c1e6868cf86","Type":"ContainerDied","Data":"f9b584b25f5128765a5a2edfc6aa7e541ad6845c731eef3b059f02685564264b"} Feb 24 00:12:24 crc kubenswrapper[4824]: I0224 00:12:24.276843 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2gh9t" event={"ID":"ee751741-65c5-4db2-aa84-8c1e6868cf86","Type":"ContainerStarted","Data":"db3d9cd416e0d097d4637e15ef0a6a67e6e68a2816d7eb41bb00fca055ead6a3"} Feb 24 00:12:24 crc kubenswrapper[4824]: I0224 00:12:24.302192 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2gh9t" podStartSLOduration=2.703407038 podStartE2EDuration="5.30217187s" podCreationTimestamp="2026-02-24 00:12:19 +0000 UTC" firstStartedPulling="2026-02-24 00:12:21.238805076 +0000 UTC m=+405.228429545" lastFinishedPulling="2026-02-24 00:12:23.837569908 +0000 UTC m=+407.827194377" observedRunningTime="2026-02-24 00:12:24.29890198 +0000 UTC m=+408.288526449" watchObservedRunningTime="2026-02-24 00:12:24.30217187 +0000 UTC m=+408.291796349" Feb 24 00:12:26 crc kubenswrapper[4824]: I0224 00:12:26.533753 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 00:12:26 crc kubenswrapper[4824]: I0224 00:12:26.534607 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9t5cw" Feb 24 00:12:26 crc kubenswrapper[4824]: I0224 00:12:26.534984 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9t5cw" Feb 24 00:12:26 crc kubenswrapper[4824]: I0224 00:12:26.578294 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-9t5cw" Feb 24 00:12:27 crc kubenswrapper[4824]: I0224 00:12:27.342664 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9t5cw" Feb 24 00:12:27 crc kubenswrapper[4824]: I0224 00:12:27.901303 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9gq54" Feb 24 00:12:27 crc kubenswrapper[4824]: I0224 00:12:27.901788 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9gq54" Feb 24 00:12:27 crc kubenswrapper[4824]: I0224 00:12:27.949875 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9gq54" Feb 24 00:12:28 crc kubenswrapper[4824]: I0224 00:12:28.030688 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 00:12:28 crc kubenswrapper[4824]: I0224 00:12:28.034476 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 00:12:28 crc kubenswrapper[4824]: I0224 00:12:28.307030 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 00:12:28 crc kubenswrapper[4824]: I0224 00:12:28.343097 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9gq54" Feb 24 00:12:28 crc kubenswrapper[4824]: I0224 00:12:28.906172 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-49mft" Feb 24 00:12:28 crc kubenswrapper[4824]: I0224 00:12:28.906236 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-49mft" 
Feb 24 00:12:28 crc kubenswrapper[4824]: I0224 00:12:28.944849 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-49mft" Feb 24 00:12:29 crc kubenswrapper[4824]: I0224 00:12:29.353683 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-49mft" Feb 24 00:12:30 crc kubenswrapper[4824]: I0224 00:12:30.320437 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2gh9t" Feb 24 00:12:30 crc kubenswrapper[4824]: I0224 00:12:30.321015 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2gh9t" Feb 24 00:12:30 crc kubenswrapper[4824]: I0224 00:12:30.358246 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2gh9t" Feb 24 00:12:31 crc kubenswrapper[4824]: I0224 00:12:31.357406 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2gh9t" Feb 24 00:12:39 crc kubenswrapper[4824]: I0224 00:12:39.308109 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jqdmp"] Feb 24 00:12:39 crc kubenswrapper[4824]: I0224 00:12:39.310543 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jqdmp" Feb 24 00:12:39 crc kubenswrapper[4824]: I0224 00:12:39.314206 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 24 00:12:39 crc kubenswrapper[4824]: I0224 00:12:39.314471 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 24 00:12:39 crc kubenswrapper[4824]: I0224 00:12:39.323272 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jqdmp"] Feb 24 00:12:39 crc kubenswrapper[4824]: I0224 00:12:39.332613 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 24 00:12:39 crc kubenswrapper[4824]: I0224 00:12:39.369316 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-mdj24"] Feb 24 00:12:39 crc kubenswrapper[4824]: I0224 00:12:39.370334 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-mdj24" Feb 24 00:12:39 crc kubenswrapper[4824]: I0224 00:12:39.370999 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bxg2\" (UniqueName: \"kubernetes.io/projected/1c407e9b-e49e-46a5-8920-786aad1539fb-kube-api-access-2bxg2\") pod \"marketplace-operator-79b997595-jqdmp\" (UID: \"1c407e9b-e49e-46a5-8920-786aad1539fb\") " pod="openshift-marketplace/marketplace-operator-79b997595-jqdmp" Feb 24 00:12:39 crc kubenswrapper[4824]: I0224 00:12:39.371194 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1c407e9b-e49e-46a5-8920-786aad1539fb-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jqdmp\" (UID: \"1c407e9b-e49e-46a5-8920-786aad1539fb\") " pod="openshift-marketplace/marketplace-operator-79b997595-jqdmp" Feb 24 00:12:39 crc kubenswrapper[4824]: I0224 00:12:39.371428 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1c407e9b-e49e-46a5-8920-786aad1539fb-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jqdmp\" (UID: \"1c407e9b-e49e-46a5-8920-786aad1539fb\") " pod="openshift-marketplace/marketplace-operator-79b997595-jqdmp" Feb 24 00:12:39 crc kubenswrapper[4824]: I0224 00:12:39.442382 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-mdj24"] Feb 24 00:12:39 crc kubenswrapper[4824]: I0224 00:12:39.472165 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/95bae32e-6c93-43ad-a262-14032654e69e-trusted-ca\") pod \"image-registry-66df7c8f76-mdj24\" (UID: \"95bae32e-6c93-43ad-a262-14032654e69e\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-mdj24" Feb 24 00:12:39 crc kubenswrapper[4824]: I0224 00:12:39.472236 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1c407e9b-e49e-46a5-8920-786aad1539fb-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jqdmp\" (UID: \"1c407e9b-e49e-46a5-8920-786aad1539fb\") " pod="openshift-marketplace/marketplace-operator-79b997595-jqdmp" Feb 24 00:12:39 crc kubenswrapper[4824]: I0224 00:12:39.472265 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/95bae32e-6c93-43ad-a262-14032654e69e-ca-trust-extracted\") pod \"image-registry-66df7c8f76-mdj24\" (UID: \"95bae32e-6c93-43ad-a262-14032654e69e\") " pod="openshift-image-registry/image-registry-66df7c8f76-mdj24" Feb 24 00:12:39 crc kubenswrapper[4824]: I0224 00:12:39.472303 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-mdj24\" (UID: \"95bae32e-6c93-43ad-a262-14032654e69e\") " pod="openshift-image-registry/image-registry-66df7c8f76-mdj24" Feb 24 00:12:39 crc kubenswrapper[4824]: I0224 00:12:39.472328 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/95bae32e-6c93-43ad-a262-14032654e69e-registry-certificates\") pod \"image-registry-66df7c8f76-mdj24\" (UID: \"95bae32e-6c93-43ad-a262-14032654e69e\") " pod="openshift-image-registry/image-registry-66df7c8f76-mdj24" Feb 24 00:12:39 crc kubenswrapper[4824]: I0224 00:12:39.472358 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bxg2\" 
(UniqueName: \"kubernetes.io/projected/1c407e9b-e49e-46a5-8920-786aad1539fb-kube-api-access-2bxg2\") pod \"marketplace-operator-79b997595-jqdmp\" (UID: \"1c407e9b-e49e-46a5-8920-786aad1539fb\") " pod="openshift-marketplace/marketplace-operator-79b997595-jqdmp" Feb 24 00:12:39 crc kubenswrapper[4824]: I0224 00:12:39.472386 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw7br\" (UniqueName: \"kubernetes.io/projected/95bae32e-6c93-43ad-a262-14032654e69e-kube-api-access-lw7br\") pod \"image-registry-66df7c8f76-mdj24\" (UID: \"95bae32e-6c93-43ad-a262-14032654e69e\") " pod="openshift-image-registry/image-registry-66df7c8f76-mdj24" Feb 24 00:12:39 crc kubenswrapper[4824]: I0224 00:12:39.472429 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/95bae32e-6c93-43ad-a262-14032654e69e-bound-sa-token\") pod \"image-registry-66df7c8f76-mdj24\" (UID: \"95bae32e-6c93-43ad-a262-14032654e69e\") " pod="openshift-image-registry/image-registry-66df7c8f76-mdj24" Feb 24 00:12:39 crc kubenswrapper[4824]: I0224 00:12:39.472456 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/95bae32e-6c93-43ad-a262-14032654e69e-installation-pull-secrets\") pod \"image-registry-66df7c8f76-mdj24\" (UID: \"95bae32e-6c93-43ad-a262-14032654e69e\") " pod="openshift-image-registry/image-registry-66df7c8f76-mdj24" Feb 24 00:12:39 crc kubenswrapper[4824]: I0224 00:12:39.472496 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1c407e9b-e49e-46a5-8920-786aad1539fb-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jqdmp\" (UID: \"1c407e9b-e49e-46a5-8920-786aad1539fb\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-jqdmp"
Feb 24 00:12:39 crc kubenswrapper[4824]: I0224 00:12:39.472618 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/95bae32e-6c93-43ad-a262-14032654e69e-registry-tls\") pod \"image-registry-66df7c8f76-mdj24\" (UID: \"95bae32e-6c93-43ad-a262-14032654e69e\") " pod="openshift-image-registry/image-registry-66df7c8f76-mdj24"
Feb 24 00:12:39 crc kubenswrapper[4824]: I0224 00:12:39.474122 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1c407e9b-e49e-46a5-8920-786aad1539fb-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jqdmp\" (UID: \"1c407e9b-e49e-46a5-8920-786aad1539fb\") " pod="openshift-marketplace/marketplace-operator-79b997595-jqdmp"
Feb 24 00:12:39 crc kubenswrapper[4824]: I0224 00:12:39.497859 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1c407e9b-e49e-46a5-8920-786aad1539fb-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jqdmp\" (UID: \"1c407e9b-e49e-46a5-8920-786aad1539fb\") " pod="openshift-marketplace/marketplace-operator-79b997595-jqdmp"
Feb 24 00:12:39 crc kubenswrapper[4824]: I0224 00:12:39.552558 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-mdj24\" (UID: \"95bae32e-6c93-43ad-a262-14032654e69e\") " pod="openshift-image-registry/image-registry-66df7c8f76-mdj24"
Feb 24 00:12:39 crc kubenswrapper[4824]: I0224 00:12:39.556515 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bxg2\" (UniqueName: \"kubernetes.io/projected/1c407e9b-e49e-46a5-8920-786aad1539fb-kube-api-access-2bxg2\") pod \"marketplace-operator-79b997595-jqdmp\" (UID: \"1c407e9b-e49e-46a5-8920-786aad1539fb\") " pod="openshift-marketplace/marketplace-operator-79b997595-jqdmp"
Feb 24 00:12:39 crc kubenswrapper[4824]: I0224 00:12:39.577020 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/95bae32e-6c93-43ad-a262-14032654e69e-registry-tls\") pod \"image-registry-66df7c8f76-mdj24\" (UID: \"95bae32e-6c93-43ad-a262-14032654e69e\") " pod="openshift-image-registry/image-registry-66df7c8f76-mdj24"
Feb 24 00:12:39 crc kubenswrapper[4824]: I0224 00:12:39.577064 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/95bae32e-6c93-43ad-a262-14032654e69e-trusted-ca\") pod \"image-registry-66df7c8f76-mdj24\" (UID: \"95bae32e-6c93-43ad-a262-14032654e69e\") " pod="openshift-image-registry/image-registry-66df7c8f76-mdj24"
Feb 24 00:12:39 crc kubenswrapper[4824]: I0224 00:12:39.577086 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/95bae32e-6c93-43ad-a262-14032654e69e-ca-trust-extracted\") pod \"image-registry-66df7c8f76-mdj24\" (UID: \"95bae32e-6c93-43ad-a262-14032654e69e\") " pod="openshift-image-registry/image-registry-66df7c8f76-mdj24"
Feb 24 00:12:39 crc kubenswrapper[4824]: I0224 00:12:39.577119 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/95bae32e-6c93-43ad-a262-14032654e69e-registry-certificates\") pod \"image-registry-66df7c8f76-mdj24\" (UID: \"95bae32e-6c93-43ad-a262-14032654e69e\") " pod="openshift-image-registry/image-registry-66df7c8f76-mdj24"
Feb 24 00:12:39 crc kubenswrapper[4824]: I0224 00:12:39.577143 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lw7br\" (UniqueName: \"kubernetes.io/projected/95bae32e-6c93-43ad-a262-14032654e69e-kube-api-access-lw7br\") pod \"image-registry-66df7c8f76-mdj24\" (UID: \"95bae32e-6c93-43ad-a262-14032654e69e\") " pod="openshift-image-registry/image-registry-66df7c8f76-mdj24"
Feb 24 00:12:39 crc kubenswrapper[4824]: I0224 00:12:39.577174 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/95bae32e-6c93-43ad-a262-14032654e69e-bound-sa-token\") pod \"image-registry-66df7c8f76-mdj24\" (UID: \"95bae32e-6c93-43ad-a262-14032654e69e\") " pod="openshift-image-registry/image-registry-66df7c8f76-mdj24"
Feb 24 00:12:39 crc kubenswrapper[4824]: I0224 00:12:39.577191 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/95bae32e-6c93-43ad-a262-14032654e69e-installation-pull-secrets\") pod \"image-registry-66df7c8f76-mdj24\" (UID: \"95bae32e-6c93-43ad-a262-14032654e69e\") " pod="openshift-image-registry/image-registry-66df7c8f76-mdj24"
Feb 24 00:12:39 crc kubenswrapper[4824]: I0224 00:12:39.577962 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/95bae32e-6c93-43ad-a262-14032654e69e-ca-trust-extracted\") pod \"image-registry-66df7c8f76-mdj24\" (UID: \"95bae32e-6c93-43ad-a262-14032654e69e\") " pod="openshift-image-registry/image-registry-66df7c8f76-mdj24"
Feb 24 00:12:39 crc kubenswrapper[4824]: I0224 00:12:39.579504 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/95bae32e-6c93-43ad-a262-14032654e69e-registry-certificates\") pod \"image-registry-66df7c8f76-mdj24\" (UID: \"95bae32e-6c93-43ad-a262-14032654e69e\") " pod="openshift-image-registry/image-registry-66df7c8f76-mdj24"
Feb 24 00:12:39 crc kubenswrapper[4824]: I0224 00:12:39.579680 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/95bae32e-6c93-43ad-a262-14032654e69e-trusted-ca\") pod \"image-registry-66df7c8f76-mdj24\" (UID: \"95bae32e-6c93-43ad-a262-14032654e69e\") " pod="openshift-image-registry/image-registry-66df7c8f76-mdj24"
Feb 24 00:12:39 crc kubenswrapper[4824]: I0224 00:12:39.583567 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/95bae32e-6c93-43ad-a262-14032654e69e-installation-pull-secrets\") pod \"image-registry-66df7c8f76-mdj24\" (UID: \"95bae32e-6c93-43ad-a262-14032654e69e\") " pod="openshift-image-registry/image-registry-66df7c8f76-mdj24"
Feb 24 00:12:39 crc kubenswrapper[4824]: I0224 00:12:39.594491 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/95bae32e-6c93-43ad-a262-14032654e69e-registry-tls\") pod \"image-registry-66df7c8f76-mdj24\" (UID: \"95bae32e-6c93-43ad-a262-14032654e69e\") " pod="openshift-image-registry/image-registry-66df7c8f76-mdj24"
Feb 24 00:12:39 crc kubenswrapper[4824]: I0224 00:12:39.615493 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/95bae32e-6c93-43ad-a262-14032654e69e-bound-sa-token\") pod \"image-registry-66df7c8f76-mdj24\" (UID: \"95bae32e-6c93-43ad-a262-14032654e69e\") " pod="openshift-image-registry/image-registry-66df7c8f76-mdj24"
Feb 24 00:12:39 crc kubenswrapper[4824]: I0224 00:12:39.618964 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw7br\" (UniqueName: \"kubernetes.io/projected/95bae32e-6c93-43ad-a262-14032654e69e-kube-api-access-lw7br\") pod \"image-registry-66df7c8f76-mdj24\" (UID: \"95bae32e-6c93-43ad-a262-14032654e69e\") " pod="openshift-image-registry/image-registry-66df7c8f76-mdj24"
Feb 24 00:12:39 crc kubenswrapper[4824]: I0224 00:12:39.629202 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jqdmp"
Feb 24 00:12:39 crc kubenswrapper[4824]: I0224 00:12:39.685195 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-mdj24"
Feb 24 00:12:40 crc kubenswrapper[4824]: I0224 00:12:40.082714 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jqdmp"]
Feb 24 00:12:40 crc kubenswrapper[4824]: I0224 00:12:40.144271 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-mdj24"]
Feb 24 00:12:40 crc kubenswrapper[4824]: W0224 00:12:40.149647 4824 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95bae32e_6c93_43ad_a262_14032654e69e.slice/crio-ce2790e4ef255bf5d31045dcdfab8c1196c0c965ebb3b510526417d1480eca89 WatchSource:0}: Error finding container ce2790e4ef255bf5d31045dcdfab8c1196c0c965ebb3b510526417d1480eca89: Status 404 returned error can't find the container with id ce2790e4ef255bf5d31045dcdfab8c1196c0c965ebb3b510526417d1480eca89
Feb 24 00:12:40 crc kubenswrapper[4824]: I0224 00:12:40.362479 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jqdmp" event={"ID":"1c407e9b-e49e-46a5-8920-786aad1539fb","Type":"ContainerStarted","Data":"314dcd6c9f107655302da7241ac56ace2263b4bc7ed0de5a2b8a32c290fd9e2f"}
Feb 24 00:12:40 crc kubenswrapper[4824]: I0224 00:12:40.362578 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jqdmp" event={"ID":"1c407e9b-e49e-46a5-8920-786aad1539fb","Type":"ContainerStarted","Data":"b74e58155541accb98ac9d70450b2a5cd4993f3029590721d053dbbd82db2a6d"}
Feb 24 00:12:40 crc kubenswrapper[4824]: I0224 00:12:40.366355 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-mdj24" event={"ID":"95bae32e-6c93-43ad-a262-14032654e69e","Type":"ContainerStarted","Data":"0d00490f13978efe897a977f84d2f0a897e06b25125ef927ff90ac6ddb0fa1dc"}
Feb 24 00:12:40 crc kubenswrapper[4824]: I0224 00:12:40.366679 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-mdj24"
Feb 24 00:12:40 crc kubenswrapper[4824]: I0224 00:12:40.366793 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-mdj24" event={"ID":"95bae32e-6c93-43ad-a262-14032654e69e","Type":"ContainerStarted","Data":"ce2790e4ef255bf5d31045dcdfab8c1196c0c965ebb3b510526417d1480eca89"}
Feb 24 00:12:40 crc kubenswrapper[4824]: I0224 00:12:40.386466 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-jqdmp" podStartSLOduration=1.386433242 podStartE2EDuration="1.386433242s" podCreationTimestamp="2026-02-24 00:12:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:12:40.380704165 +0000 UTC m=+424.370328654" watchObservedRunningTime="2026-02-24 00:12:40.386433242 +0000 UTC m=+424.376057711"
Feb 24 00:12:40 crc kubenswrapper[4824]: I0224 00:12:40.410999 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-mdj24" podStartSLOduration=1.410972053 podStartE2EDuration="1.410972053s" podCreationTimestamp="2026-02-24 00:12:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:12:40.408000681 +0000 UTC m=+424.397625170" watchObservedRunningTime="2026-02-24 00:12:40.410972053 +0000 UTC m=+424.400596522"
Feb 24 00:12:41 crc kubenswrapper[4824]: I0224 00:12:41.372348 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-jqdmp"
Feb 24 00:12:41 crc kubenswrapper[4824]: I0224 00:12:41.375190 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-jqdmp"
Feb 24 00:12:53 crc kubenswrapper[4824]: I0224 00:12:53.276266 4824 patch_prober.go:28] interesting pod/machine-config-daemon-vcbgn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 24 00:12:53 crc kubenswrapper[4824]: I0224 00:12:53.276980 4824 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 24 00:12:59 crc kubenswrapper[4824]: I0224 00:12:59.692025 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-mdj24"
Feb 24 00:12:59 crc kubenswrapper[4824]: I0224 00:12:59.822170 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ccm27"]
Feb 24 00:13:13 crc kubenswrapper[4824]: I0224 00:13:13.810739 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 24 00:13:13 crc kubenswrapper[4824]: I0224 00:13:13.811307 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 24 00:13:13 crc kubenswrapper[4824]: I0224 00:13:13.812730 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 24 00:13:13 crc kubenswrapper[4824]: I0224 00:13:13.817326 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 24 00:13:13 crc kubenswrapper[4824]: I0224 00:13:13.894941 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 24 00:13:14 crc kubenswrapper[4824]: I0224 00:13:14.576049 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"ddd5396cef94d398a8f1a79fe5f2cb1b3279a90c8560d5f7dd6294be0bb10b17"}
Feb 24 00:13:14 crc kubenswrapper[4824]: I0224 00:13:14.576746 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"90738039084f55791a3d63e013cc7682f6c33620cbd2e333e8f898f890afb896"}
Feb 24 00:13:14 crc kubenswrapper[4824]: I0224 00:13:14.826080 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 24 00:13:14 crc kubenswrapper[4824]: I0224 00:13:14.826181 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 24 00:13:14 crc kubenswrapper[4824]: I0224 00:13:14.831484 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 24 00:13:14 crc kubenswrapper[4824]: I0224 00:13:14.834787 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 24 00:13:14 crc kubenswrapper[4824]: I0224 00:13:14.995373 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 24 00:13:15 crc kubenswrapper[4824]: I0224 00:13:15.102844 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 24 00:13:15 crc kubenswrapper[4824]: W0224 00:13:15.204271 4824 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-c0f2a4cae34d3f187f11caccad67cdd7133fe47d97e8687a9d7e96747e39a060 WatchSource:0}: Error finding container c0f2a4cae34d3f187f11caccad67cdd7133fe47d97e8687a9d7e96747e39a060: Status 404 returned error can't find the container with id c0f2a4cae34d3f187f11caccad67cdd7133fe47d97e8687a9d7e96747e39a060
Feb 24 00:13:15 crc kubenswrapper[4824]: W0224 00:13:15.302576 4824 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-e38be0f7ea2ce587e69b12592191eaf560277882839ac1ba66f2bc05f2f5b85d WatchSource:0}: Error finding container e38be0f7ea2ce587e69b12592191eaf560277882839ac1ba66f2bc05f2f5b85d: Status 404 returned error can't find the container with id e38be0f7ea2ce587e69b12592191eaf560277882839ac1ba66f2bc05f2f5b85d
Feb 24 00:13:15 crc kubenswrapper[4824]: I0224 00:13:15.583819 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"152e6c347c5790dfebc84dd07ddc2793cc704355e1a548c5fda9185e071c390e"}
Feb 24 00:13:15 crc kubenswrapper[4824]: I0224 00:13:15.583879 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"c0f2a4cae34d3f187f11caccad67cdd7133fe47d97e8687a9d7e96747e39a060"}
Feb 24 00:13:16 crc kubenswrapper[4824]: I0224 00:13:15.585858 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"69b779f2ca4324453996885939e7be3c2c022c7db2214f41b97d3b80f5afe0c5"}
Feb 24 00:13:16 crc kubenswrapper[4824]: I0224 00:13:15.585913 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"e38be0f7ea2ce587e69b12592191eaf560277882839ac1ba66f2bc05f2f5b85d"}
Feb 24 00:13:16 crc kubenswrapper[4824]: I0224 00:13:15.586614 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 24 00:13:23 crc kubenswrapper[4824]: I0224 00:13:23.277081 4824 patch_prober.go:28] interesting pod/machine-config-daemon-vcbgn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 24 00:13:23 crc kubenswrapper[4824]: I0224 00:13:23.277715 4824 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 24 00:13:24 crc kubenswrapper[4824]: I0224 00:13:24.870466 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" podUID="9016587d-3cd5-46d7-bd50-586cd32933f7" containerName="registry" containerID="cri-o://2298440fb6247a85651f7297a9c931c5be1fbe577d08f83bab4ec4ae10e9e795" gracePeriod=30
Feb 24 00:13:25 crc kubenswrapper[4824]: I0224 00:13:25.242984 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ccm27"
Feb 24 00:13:25 crc kubenswrapper[4824]: I0224 00:13:25.391882 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9016587d-3cd5-46d7-bd50-586cd32933f7-installation-pull-secrets\") pod \"9016587d-3cd5-46d7-bd50-586cd32933f7\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") "
Feb 24 00:13:25 crc kubenswrapper[4824]: I0224 00:13:25.391975 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9016587d-3cd5-46d7-bd50-586cd32933f7-bound-sa-token\") pod \"9016587d-3cd5-46d7-bd50-586cd32933f7\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") "
Feb 24 00:13:25 crc kubenswrapper[4824]: I0224 00:13:25.392016 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9016587d-3cd5-46d7-bd50-586cd32933f7-ca-trust-extracted\") pod \"9016587d-3cd5-46d7-bd50-586cd32933f7\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") "
Feb 24 00:13:25 crc kubenswrapper[4824]: I0224 00:13:25.392307 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"9016587d-3cd5-46d7-bd50-586cd32933f7\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") "
Feb 24 00:13:25 crc kubenswrapper[4824]: I0224 00:13:25.392374 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdkzl\" (UniqueName: \"kubernetes.io/projected/9016587d-3cd5-46d7-bd50-586cd32933f7-kube-api-access-cdkzl\") pod \"9016587d-3cd5-46d7-bd50-586cd32933f7\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") "
Feb 24 00:13:25 crc kubenswrapper[4824]: I0224 00:13:25.392401 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9016587d-3cd5-46d7-bd50-586cd32933f7-trusted-ca\") pod \"9016587d-3cd5-46d7-bd50-586cd32933f7\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") "
Feb 24 00:13:25 crc kubenswrapper[4824]: I0224 00:13:25.392511 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9016587d-3cd5-46d7-bd50-586cd32933f7-registry-certificates\") pod \"9016587d-3cd5-46d7-bd50-586cd32933f7\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") "
Feb 24 00:13:25 crc kubenswrapper[4824]: I0224 00:13:25.392567 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9016587d-3cd5-46d7-bd50-586cd32933f7-registry-tls\") pod \"9016587d-3cd5-46d7-bd50-586cd32933f7\" (UID: \"9016587d-3cd5-46d7-bd50-586cd32933f7\") "
Feb 24 00:13:25 crc kubenswrapper[4824]: I0224 00:13:25.394192 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9016587d-3cd5-46d7-bd50-586cd32933f7-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "9016587d-3cd5-46d7-bd50-586cd32933f7" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 00:13:25 crc kubenswrapper[4824]: I0224 00:13:25.394209 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9016587d-3cd5-46d7-bd50-586cd32933f7-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9016587d-3cd5-46d7-bd50-586cd32933f7" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 00:13:25 crc kubenswrapper[4824]: I0224 00:13:25.400134 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9016587d-3cd5-46d7-bd50-586cd32933f7-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "9016587d-3cd5-46d7-bd50-586cd32933f7" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 00:13:25 crc kubenswrapper[4824]: I0224 00:13:25.401384 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9016587d-3cd5-46d7-bd50-586cd32933f7-kube-api-access-cdkzl" (OuterVolumeSpecName: "kube-api-access-cdkzl") pod "9016587d-3cd5-46d7-bd50-586cd32933f7" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7"). InnerVolumeSpecName "kube-api-access-cdkzl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 00:13:25 crc kubenswrapper[4824]: I0224 00:13:25.401864 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9016587d-3cd5-46d7-bd50-586cd32933f7-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "9016587d-3cd5-46d7-bd50-586cd32933f7" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 00:13:25 crc kubenswrapper[4824]: I0224 00:13:25.401963 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9016587d-3cd5-46d7-bd50-586cd32933f7-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "9016587d-3cd5-46d7-bd50-586cd32933f7" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 00:13:25 crc kubenswrapper[4824]: I0224 00:13:25.405570 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "9016587d-3cd5-46d7-bd50-586cd32933f7" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Feb 24 00:13:25 crc kubenswrapper[4824]: I0224 00:13:25.409809 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9016587d-3cd5-46d7-bd50-586cd32933f7-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "9016587d-3cd5-46d7-bd50-586cd32933f7" (UID: "9016587d-3cd5-46d7-bd50-586cd32933f7"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 00:13:25 crc kubenswrapper[4824]: I0224 00:13:25.494097 4824 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9016587d-3cd5-46d7-bd50-586cd32933f7-registry-certificates\") on node \"crc\" DevicePath \"\""
Feb 24 00:13:25 crc kubenswrapper[4824]: I0224 00:13:25.494894 4824 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9016587d-3cd5-46d7-bd50-586cd32933f7-registry-tls\") on node \"crc\" DevicePath \"\""
Feb 24 00:13:25 crc kubenswrapper[4824]: I0224 00:13:25.494931 4824 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9016587d-3cd5-46d7-bd50-586cd32933f7-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Feb 24 00:13:25 crc kubenswrapper[4824]: I0224 00:13:25.494945 4824 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9016587d-3cd5-46d7-bd50-586cd32933f7-bound-sa-token\") on node \"crc\" DevicePath \"\""
Feb 24 00:13:25 crc kubenswrapper[4824]: I0224 00:13:25.494959 4824 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9016587d-3cd5-46d7-bd50-586cd32933f7-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Feb 24 00:13:25 crc kubenswrapper[4824]: I0224 00:13:25.494972 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdkzl\" (UniqueName: \"kubernetes.io/projected/9016587d-3cd5-46d7-bd50-586cd32933f7-kube-api-access-cdkzl\") on node \"crc\" DevicePath \"\""
Feb 24 00:13:25 crc kubenswrapper[4824]: I0224 00:13:25.494983 4824 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9016587d-3cd5-46d7-bd50-586cd32933f7-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 24 00:13:25 crc kubenswrapper[4824]: I0224 00:13:25.652937 4824 generic.go:334] "Generic (PLEG): container finished" podID="9016587d-3cd5-46d7-bd50-586cd32933f7" containerID="2298440fb6247a85651f7297a9c931c5be1fbe577d08f83bab4ec4ae10e9e795" exitCode=0
Feb 24 00:13:25 crc kubenswrapper[4824]: I0224 00:13:25.652999 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" event={"ID":"9016587d-3cd5-46d7-bd50-586cd32933f7","Type":"ContainerDied","Data":"2298440fb6247a85651f7297a9c931c5be1fbe577d08f83bab4ec4ae10e9e795"}
Feb 24 00:13:25 crc kubenswrapper[4824]: I0224 00:13:25.653041 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ccm27" event={"ID":"9016587d-3cd5-46d7-bd50-586cd32933f7","Type":"ContainerDied","Data":"7258e3c460d9eb30e7b444c92e1cb2427c103a3e9b4014b73c4a4fe6cecde128"}
Feb 24 00:13:25 crc kubenswrapper[4824]: I0224 00:13:25.653072 4824 scope.go:117] "RemoveContainer" containerID="2298440fb6247a85651f7297a9c931c5be1fbe577d08f83bab4ec4ae10e9e795"
Feb 24 00:13:25 crc kubenswrapper[4824]: I0224 00:13:25.653259 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ccm27"
Feb 24 00:13:25 crc kubenswrapper[4824]: I0224 00:13:25.700328 4824 scope.go:117] "RemoveContainer" containerID="2298440fb6247a85651f7297a9c931c5be1fbe577d08f83bab4ec4ae10e9e795"
Feb 24 00:13:25 crc kubenswrapper[4824]: E0224 00:13:25.703730 4824 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2298440fb6247a85651f7297a9c931c5be1fbe577d08f83bab4ec4ae10e9e795\": container with ID starting with 2298440fb6247a85651f7297a9c931c5be1fbe577d08f83bab4ec4ae10e9e795 not found: ID does not exist" containerID="2298440fb6247a85651f7297a9c931c5be1fbe577d08f83bab4ec4ae10e9e795"
Feb 24 00:13:25 crc kubenswrapper[4824]: I0224 00:13:25.703857 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2298440fb6247a85651f7297a9c931c5be1fbe577d08f83bab4ec4ae10e9e795"} err="failed to get container status \"2298440fb6247a85651f7297a9c931c5be1fbe577d08f83bab4ec4ae10e9e795\": rpc error: code = NotFound desc = could not find container \"2298440fb6247a85651f7297a9c931c5be1fbe577d08f83bab4ec4ae10e9e795\": container with ID starting with 2298440fb6247a85651f7297a9c931c5be1fbe577d08f83bab4ec4ae10e9e795 not found: ID does not exist"
Feb 24 00:13:25 crc kubenswrapper[4824]: I0224 00:13:25.736106 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ccm27"]
Feb 24 00:13:25 crc kubenswrapper[4824]: I0224 00:13:25.741197 4824 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ccm27"]
Feb 24 00:13:26 crc kubenswrapper[4824]: I0224 00:13:26.701752 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9016587d-3cd5-46d7-bd50-586cd32933f7" path="/var/lib/kubelet/pods/9016587d-3cd5-46d7-bd50-586cd32933f7/volumes"
Feb 24 00:13:45 crc kubenswrapper[4824]: I0224 00:13:45.109372 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 24 00:13:53 crc kubenswrapper[4824]: I0224 00:13:53.275961 4824 patch_prober.go:28] interesting pod/machine-config-daemon-vcbgn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 24 00:13:53 crc kubenswrapper[4824]: I0224 00:13:53.276544 4824 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 24 00:13:53 crc kubenswrapper[4824]: I0224 00:13:53.276637 4824 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn"
Feb 24 00:13:53 crc kubenswrapper[4824]: I0224 00:13:53.277707 4824 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ec5f29f7aaf13391c2278f1eb972e5c2f9ed40d998b7f6d08d6d97e54173df94"} pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 24 00:13:53 crc kubenswrapper[4824]: I0224 00:13:53.277837 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" containerName="machine-config-daemon" containerID="cri-o://ec5f29f7aaf13391c2278f1eb972e5c2f9ed40d998b7f6d08d6d97e54173df94" gracePeriod=600
Feb 24 00:13:53 crc kubenswrapper[4824]: I0224 00:13:53.846450 4824 generic.go:334] "Generic (PLEG): container finished" podID="939ca085-9383-42e6-b7d6-37f101137273" containerID="ec5f29f7aaf13391c2278f1eb972e5c2f9ed40d998b7f6d08d6d97e54173df94" exitCode=0
Feb 24 00:13:53 crc kubenswrapper[4824]: I0224 00:13:53.846540 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" event={"ID":"939ca085-9383-42e6-b7d6-37f101137273","Type":"ContainerDied","Data":"ec5f29f7aaf13391c2278f1eb972e5c2f9ed40d998b7f6d08d6d97e54173df94"}
Feb 24 00:13:53 crc kubenswrapper[4824]: I0224 00:13:53.846843 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" event={"ID":"939ca085-9383-42e6-b7d6-37f101137273","Type":"ContainerStarted","Data":"14f28b64a526a9334cfaacd13a3a23756d3ea46670a60bcfe695a7e80551056e"}
Feb 24 00:13:53 crc kubenswrapper[4824]: I0224 00:13:53.846868 4824 scope.go:117] "RemoveContainer" containerID="13c56d6a66d6912c9ae019eb515bfeb043d9ce26a200da5330581d07cd849ba3"
Feb 24 00:15:00 crc kubenswrapper[4824]: I0224 00:15:00.171879 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29531535-6rfpn"]
Feb 24 00:15:00 crc kubenswrapper[4824]: E0224 00:15:00.172855 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9016587d-3cd5-46d7-bd50-586cd32933f7" containerName="registry"
Feb 24 00:15:00 crc kubenswrapper[4824]: I0224 00:15:00.172877 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="9016587d-3cd5-46d7-bd50-586cd32933f7" containerName="registry"
Feb 24 00:15:00 crc kubenswrapper[4824]: I0224 00:15:00.173024 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="9016587d-3cd5-46d7-bd50-586cd32933f7" containerName="registry"
Feb 24 00:15:00 crc kubenswrapper[4824]: I0224 00:15:00.173683 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29531535-6rfpn"
Feb 24 00:15:00 crc kubenswrapper[4824]: I0224 00:15:00.177777 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 24 00:15:00 crc kubenswrapper[4824]: I0224 00:15:00.181508 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 24 00:15:00 crc kubenswrapper[4824]: I0224 00:15:00.182840 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29531535-6rfpn"]
Feb 24 00:15:00 crc kubenswrapper[4824]: I0224 00:15:00.223434 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/98a4dbc5-a115-4a3a-a5cb-36a037813cc0-secret-volume\") pod \"collect-profiles-29531535-6rfpn\" (UID: \"98a4dbc5-a115-4a3a-a5cb-36a037813cc0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531535-6rfpn"
Feb 24 00:15:00 crc kubenswrapper[4824]: I0224 00:15:00.223581 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6qjz\" (UniqueName: \"kubernetes.io/projected/98a4dbc5-a115-4a3a-a5cb-36a037813cc0-kube-api-access-t6qjz\") pod \"collect-profiles-29531535-6rfpn\" (UID: \"98a4dbc5-a115-4a3a-a5cb-36a037813cc0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531535-6rfpn"
Feb 24 00:15:00 crc kubenswrapper[4824]: I0224 00:15:00.223682 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/98a4dbc5-a115-4a3a-a5cb-36a037813cc0-config-volume\") pod \"collect-profiles-29531535-6rfpn\" (UID: \"98a4dbc5-a115-4a3a-a5cb-36a037813cc0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531535-6rfpn"
Feb 24 00:15:00 crc kubenswrapper[4824]: I0224 00:15:00.325013 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6qjz\" (UniqueName: \"kubernetes.io/projected/98a4dbc5-a115-4a3a-a5cb-36a037813cc0-kube-api-access-t6qjz\") pod \"collect-profiles-29531535-6rfpn\" (UID: \"98a4dbc5-a115-4a3a-a5cb-36a037813cc0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531535-6rfpn"
Feb 24 00:15:00 crc kubenswrapper[4824]: I0224 00:15:00.325103 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/98a4dbc5-a115-4a3a-a5cb-36a037813cc0-config-volume\") pod \"collect-profiles-29531535-6rfpn\" (UID: \"98a4dbc5-a115-4a3a-a5cb-36a037813cc0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531535-6rfpn"
Feb 24 00:15:00 crc kubenswrapper[4824]: I0224 00:15:00.325137 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/98a4dbc5-a115-4a3a-a5cb-36a037813cc0-secret-volume\") pod \"collect-profiles-29531535-6rfpn\" (UID: \"98a4dbc5-a115-4a3a-a5cb-36a037813cc0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531535-6rfpn"
Feb 24 00:15:00 crc kubenswrapper[4824]: I0224 00:15:00.326855 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/98a4dbc5-a115-4a3a-a5cb-36a037813cc0-config-volume\") pod \"collect-profiles-29531535-6rfpn\" (UID: \"98a4dbc5-a115-4a3a-a5cb-36a037813cc0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531535-6rfpn"
Feb 24 00:15:00 crc kubenswrapper[4824]: I0224 00:15:00.331736 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName:
\"kubernetes.io/secret/98a4dbc5-a115-4a3a-a5cb-36a037813cc0-secret-volume\") pod \"collect-profiles-29531535-6rfpn\" (UID: \"98a4dbc5-a115-4a3a-a5cb-36a037813cc0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531535-6rfpn" Feb 24 00:15:00 crc kubenswrapper[4824]: I0224 00:15:00.340961 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6qjz\" (UniqueName: \"kubernetes.io/projected/98a4dbc5-a115-4a3a-a5cb-36a037813cc0-kube-api-access-t6qjz\") pod \"collect-profiles-29531535-6rfpn\" (UID: \"98a4dbc5-a115-4a3a-a5cb-36a037813cc0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531535-6rfpn" Feb 24 00:15:00 crc kubenswrapper[4824]: I0224 00:15:00.503948 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29531535-6rfpn" Feb 24 00:15:00 crc kubenswrapper[4824]: I0224 00:15:00.712224 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29531535-6rfpn"] Feb 24 00:15:01 crc kubenswrapper[4824]: I0224 00:15:01.295845 4824 generic.go:334] "Generic (PLEG): container finished" podID="98a4dbc5-a115-4a3a-a5cb-36a037813cc0" containerID="93001c421cb7488ac129c49bbd0067fad898c23e83f878d2ef1ccb98ffc04df3" exitCode=0 Feb 24 00:15:01 crc kubenswrapper[4824]: I0224 00:15:01.295895 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29531535-6rfpn" event={"ID":"98a4dbc5-a115-4a3a-a5cb-36a037813cc0","Type":"ContainerDied","Data":"93001c421cb7488ac129c49bbd0067fad898c23e83f878d2ef1ccb98ffc04df3"} Feb 24 00:15:01 crc kubenswrapper[4824]: I0224 00:15:01.295955 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29531535-6rfpn" 
event={"ID":"98a4dbc5-a115-4a3a-a5cb-36a037813cc0","Type":"ContainerStarted","Data":"e1a569931aa90fecd5361a82afa72a79e893d766a73d1cfde3102edb0ac7ae3a"} Feb 24 00:15:02 crc kubenswrapper[4824]: I0224 00:15:02.567443 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29531535-6rfpn" Feb 24 00:15:02 crc kubenswrapper[4824]: I0224 00:15:02.754063 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/98a4dbc5-a115-4a3a-a5cb-36a037813cc0-secret-volume\") pod \"98a4dbc5-a115-4a3a-a5cb-36a037813cc0\" (UID: \"98a4dbc5-a115-4a3a-a5cb-36a037813cc0\") " Feb 24 00:15:02 crc kubenswrapper[4824]: I0224 00:15:02.754570 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6qjz\" (UniqueName: \"kubernetes.io/projected/98a4dbc5-a115-4a3a-a5cb-36a037813cc0-kube-api-access-t6qjz\") pod \"98a4dbc5-a115-4a3a-a5cb-36a037813cc0\" (UID: \"98a4dbc5-a115-4a3a-a5cb-36a037813cc0\") " Feb 24 00:15:02 crc kubenswrapper[4824]: I0224 00:15:02.754750 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/98a4dbc5-a115-4a3a-a5cb-36a037813cc0-config-volume\") pod \"98a4dbc5-a115-4a3a-a5cb-36a037813cc0\" (UID: \"98a4dbc5-a115-4a3a-a5cb-36a037813cc0\") " Feb 24 00:15:02 crc kubenswrapper[4824]: I0224 00:15:02.755412 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98a4dbc5-a115-4a3a-a5cb-36a037813cc0-config-volume" (OuterVolumeSpecName: "config-volume") pod "98a4dbc5-a115-4a3a-a5cb-36a037813cc0" (UID: "98a4dbc5-a115-4a3a-a5cb-36a037813cc0"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:15:02 crc kubenswrapper[4824]: I0224 00:15:02.759671 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98a4dbc5-a115-4a3a-a5cb-36a037813cc0-kube-api-access-t6qjz" (OuterVolumeSpecName: "kube-api-access-t6qjz") pod "98a4dbc5-a115-4a3a-a5cb-36a037813cc0" (UID: "98a4dbc5-a115-4a3a-a5cb-36a037813cc0"). InnerVolumeSpecName "kube-api-access-t6qjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:15:02 crc kubenswrapper[4824]: I0224 00:15:02.760687 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98a4dbc5-a115-4a3a-a5cb-36a037813cc0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "98a4dbc5-a115-4a3a-a5cb-36a037813cc0" (UID: "98a4dbc5-a115-4a3a-a5cb-36a037813cc0"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:15:02 crc kubenswrapper[4824]: I0224 00:15:02.856056 4824 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/98a4dbc5-a115-4a3a-a5cb-36a037813cc0-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 24 00:15:02 crc kubenswrapper[4824]: I0224 00:15:02.856093 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6qjz\" (UniqueName: \"kubernetes.io/projected/98a4dbc5-a115-4a3a-a5cb-36a037813cc0-kube-api-access-t6qjz\") on node \"crc\" DevicePath \"\"" Feb 24 00:15:02 crc kubenswrapper[4824]: I0224 00:15:02.856103 4824 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/98a4dbc5-a115-4a3a-a5cb-36a037813cc0-config-volume\") on node \"crc\" DevicePath \"\"" Feb 24 00:15:03 crc kubenswrapper[4824]: I0224 00:15:03.311661 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29531535-6rfpn" 
event={"ID":"98a4dbc5-a115-4a3a-a5cb-36a037813cc0","Type":"ContainerDied","Data":"e1a569931aa90fecd5361a82afa72a79e893d766a73d1cfde3102edb0ac7ae3a"} Feb 24 00:15:03 crc kubenswrapper[4824]: I0224 00:15:03.311730 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29531535-6rfpn" Feb 24 00:15:03 crc kubenswrapper[4824]: I0224 00:15:03.311745 4824 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1a569931aa90fecd5361a82afa72a79e893d766a73d1cfde3102edb0ac7ae3a" Feb 24 00:15:53 crc kubenswrapper[4824]: I0224 00:15:53.276027 4824 patch_prober.go:28] interesting pod/machine-config-daemon-vcbgn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 00:15:53 crc kubenswrapper[4824]: I0224 00:15:53.278010 4824 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 00:16:23 crc kubenswrapper[4824]: I0224 00:16:23.276833 4824 patch_prober.go:28] interesting pod/machine-config-daemon-vcbgn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 00:16:23 crc kubenswrapper[4824]: I0224 00:16:23.277658 4824 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.066636 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4xjg6"] Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.067990 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" podUID="d985b875-dd5e-4767-a4e2-209894575a8f" containerName="ovn-controller" containerID="cri-o://0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91" gracePeriod=30 Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.068049 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" podUID="d985b875-dd5e-4767-a4e2-209894575a8f" containerName="nbdb" containerID="cri-o://4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7" gracePeriod=30 Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.068214 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" podUID="d985b875-dd5e-4767-a4e2-209894575a8f" containerName="northd" containerID="cri-o://f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04" gracePeriod=30 Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.068247 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" podUID="d985b875-dd5e-4767-a4e2-209894575a8f" containerName="sbdb" containerID="cri-o://05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430" gracePeriod=30 Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.068330 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" podUID="d985b875-dd5e-4767-a4e2-209894575a8f" containerName="kube-rbac-proxy-ovn-metrics" 
containerID="cri-o://b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2" gracePeriod=30 Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.068389 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" podUID="d985b875-dd5e-4767-a4e2-209894575a8f" containerName="ovn-acl-logging" containerID="cri-o://8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44" gracePeriod=30 Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.068454 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" podUID="d985b875-dd5e-4767-a4e2-209894575a8f" containerName="kube-rbac-proxy-node" containerID="cri-o://869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8" gracePeriod=30 Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.128470 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" podUID="d985b875-dd5e-4767-a4e2-209894575a8f" containerName="ovnkube-controller" containerID="cri-o://a7e1ce655b2a5d070089d56de6a8dfa5549a4154911bfccd6be2b38df9c3b5ee" gracePeriod=30 Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.427211 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xjg6_d985b875-dd5e-4767-a4e2-209894575a8f/ovnkube-controller/3.log" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.429491 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xjg6_d985b875-dd5e-4767-a4e2-209894575a8f/ovn-acl-logging/0.log" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.430050 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xjg6_d985b875-dd5e-4767-a4e2-209894575a8f/ovn-controller/0.log" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.430862 4824 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.489646 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-t4spw"] Feb 24 00:16:28 crc kubenswrapper[4824]: E0224 00:16:28.489907 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d985b875-dd5e-4767-a4e2-209894575a8f" containerName="ovnkube-controller" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.489924 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="d985b875-dd5e-4767-a4e2-209894575a8f" containerName="ovnkube-controller" Feb 24 00:16:28 crc kubenswrapper[4824]: E0224 00:16:28.489936 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d985b875-dd5e-4767-a4e2-209894575a8f" containerName="nbdb" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.489944 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="d985b875-dd5e-4767-a4e2-209894575a8f" containerName="nbdb" Feb 24 00:16:28 crc kubenswrapper[4824]: E0224 00:16:28.489959 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98a4dbc5-a115-4a3a-a5cb-36a037813cc0" containerName="collect-profiles" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.489969 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="98a4dbc5-a115-4a3a-a5cb-36a037813cc0" containerName="collect-profiles" Feb 24 00:16:28 crc kubenswrapper[4824]: E0224 00:16:28.489981 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d985b875-dd5e-4767-a4e2-209894575a8f" containerName="ovn-acl-logging" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.489990 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="d985b875-dd5e-4767-a4e2-209894575a8f" containerName="ovn-acl-logging" Feb 24 00:16:28 crc kubenswrapper[4824]: E0224 00:16:28.490003 4824 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d985b875-dd5e-4767-a4e2-209894575a8f" containerName="ovnkube-controller" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.490010 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="d985b875-dd5e-4767-a4e2-209894575a8f" containerName="ovnkube-controller" Feb 24 00:16:28 crc kubenswrapper[4824]: E0224 00:16:28.490023 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d985b875-dd5e-4767-a4e2-209894575a8f" containerName="ovnkube-controller" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.490031 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="d985b875-dd5e-4767-a4e2-209894575a8f" containerName="ovnkube-controller" Feb 24 00:16:28 crc kubenswrapper[4824]: E0224 00:16:28.490044 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d985b875-dd5e-4767-a4e2-209894575a8f" containerName="kube-rbac-proxy-node" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.490052 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="d985b875-dd5e-4767-a4e2-209894575a8f" containerName="kube-rbac-proxy-node" Feb 24 00:16:28 crc kubenswrapper[4824]: E0224 00:16:28.490063 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d985b875-dd5e-4767-a4e2-209894575a8f" containerName="ovn-controller" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.490071 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="d985b875-dd5e-4767-a4e2-209894575a8f" containerName="ovn-controller" Feb 24 00:16:28 crc kubenswrapper[4824]: E0224 00:16:28.490081 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d985b875-dd5e-4767-a4e2-209894575a8f" containerName="kube-rbac-proxy-ovn-metrics" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.490089 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="d985b875-dd5e-4767-a4e2-209894575a8f" containerName="kube-rbac-proxy-ovn-metrics" Feb 24 00:16:28 crc kubenswrapper[4824]: E0224 00:16:28.490104 4824 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="d985b875-dd5e-4767-a4e2-209894575a8f" containerName="northd" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.490112 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="d985b875-dd5e-4767-a4e2-209894575a8f" containerName="northd" Feb 24 00:16:28 crc kubenswrapper[4824]: E0224 00:16:28.490123 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d985b875-dd5e-4767-a4e2-209894575a8f" containerName="sbdb" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.490131 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="d985b875-dd5e-4767-a4e2-209894575a8f" containerName="sbdb" Feb 24 00:16:28 crc kubenswrapper[4824]: E0224 00:16:28.490143 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d985b875-dd5e-4767-a4e2-209894575a8f" containerName="kubecfg-setup" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.490151 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="d985b875-dd5e-4767-a4e2-209894575a8f" containerName="kubecfg-setup" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.490258 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="d985b875-dd5e-4767-a4e2-209894575a8f" containerName="ovnkube-controller" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.490269 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="d985b875-dd5e-4767-a4e2-209894575a8f" containerName="ovn-controller" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.490283 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="d985b875-dd5e-4767-a4e2-209894575a8f" containerName="sbdb" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.490294 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="98a4dbc5-a115-4a3a-a5cb-36a037813cc0" containerName="collect-profiles" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.490306 4824 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d985b875-dd5e-4767-a4e2-209894575a8f" containerName="ovnkube-controller" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.490314 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="d985b875-dd5e-4767-a4e2-209894575a8f" containerName="kube-rbac-proxy-node" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.490327 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="d985b875-dd5e-4767-a4e2-209894575a8f" containerName="nbdb" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.490340 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="d985b875-dd5e-4767-a4e2-209894575a8f" containerName="ovnkube-controller" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.490352 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="d985b875-dd5e-4767-a4e2-209894575a8f" containerName="kube-rbac-proxy-ovn-metrics" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.490363 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="d985b875-dd5e-4767-a4e2-209894575a8f" containerName="ovnkube-controller" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.490373 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="d985b875-dd5e-4767-a4e2-209894575a8f" containerName="northd" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.490386 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="d985b875-dd5e-4767-a4e2-209894575a8f" containerName="ovn-acl-logging" Feb 24 00:16:28 crc kubenswrapper[4824]: E0224 00:16:28.490489 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d985b875-dd5e-4767-a4e2-209894575a8f" containerName="ovnkube-controller" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.490499 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="d985b875-dd5e-4767-a4e2-209894575a8f" containerName="ovnkube-controller" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.490655 4824 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="d985b875-dd5e-4767-a4e2-209894575a8f" containerName="ovnkube-controller" Feb 24 00:16:28 crc kubenswrapper[4824]: E0224 00:16:28.490758 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d985b875-dd5e-4767-a4e2-209894575a8f" containerName="ovnkube-controller" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.490769 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="d985b875-dd5e-4767-a4e2-209894575a8f" containerName="ovnkube-controller" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.492711 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.550272 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-systemd-units\") pod \"d985b875-dd5e-4767-a4e2-209894575a8f\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.550335 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d985b875-dd5e-4767-a4e2-209894575a8f-ovnkube-script-lib\") pod \"d985b875-dd5e-4767-a4e2-209894575a8f\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.550371 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6rnj\" (UniqueName: \"kubernetes.io/projected/d985b875-dd5e-4767-a4e2-209894575a8f-kube-api-access-x6rnj\") pod \"d985b875-dd5e-4767-a4e2-209894575a8f\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.550392 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-var-lib-openvswitch\") pod \"d985b875-dd5e-4767-a4e2-209894575a8f\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.550425 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-host-cni-bin\") pod \"d985b875-dd5e-4767-a4e2-209894575a8f\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.550451 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-run-openvswitch\") pod \"d985b875-dd5e-4767-a4e2-209894575a8f\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.550477 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-run-ovn\") pod \"d985b875-dd5e-4767-a4e2-209894575a8f\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.550536 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-log-socket\") pod \"d985b875-dd5e-4767-a4e2-209894575a8f\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.550612 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d985b875-dd5e-4767-a4e2-209894575a8f-ovn-node-metrics-cert\") pod \"d985b875-dd5e-4767-a4e2-209894575a8f\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 
00:16:28.550655 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-host-cni-netd\") pod \"d985b875-dd5e-4767-a4e2-209894575a8f\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.550464 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "d985b875-dd5e-4767-a4e2-209894575a8f" (UID: "d985b875-dd5e-4767-a4e2-209894575a8f"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.550508 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "d985b875-dd5e-4767-a4e2-209894575a8f" (UID: "d985b875-dd5e-4767-a4e2-209894575a8f"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.550564 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "d985b875-dd5e-4767-a4e2-209894575a8f" (UID: "d985b875-dd5e-4767-a4e2-209894575a8f"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.550600 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "d985b875-dd5e-4767-a4e2-209894575a8f" (UID: "d985b875-dd5e-4767-a4e2-209894575a8f"). 
InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.550656 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "d985b875-dd5e-4767-a4e2-209894575a8f" (UID: "d985b875-dd5e-4767-a4e2-209894575a8f"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.550686 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-log-socket" (OuterVolumeSpecName: "log-socket") pod "d985b875-dd5e-4767-a4e2-209894575a8f" (UID: "d985b875-dd5e-4767-a4e2-209894575a8f"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.550693 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-run-systemd\") pod \"d985b875-dd5e-4767-a4e2-209894575a8f\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.550813 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"d985b875-dd5e-4767-a4e2-209894575a8f\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.550856 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d985b875-dd5e-4767-a4e2-209894575a8f-env-overrides\") pod 
\"d985b875-dd5e-4767-a4e2-209894575a8f\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.550897 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-node-log\") pod \"d985b875-dd5e-4767-a4e2-209894575a8f\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.550922 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-etc-openvswitch\") pod \"d985b875-dd5e-4767-a4e2-209894575a8f\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.550956 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-host-kubelet\") pod \"d985b875-dd5e-4767-a4e2-209894575a8f\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.550997 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-host-run-netns\") pod \"d985b875-dd5e-4767-a4e2-209894575a8f\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.551020 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-host-run-ovn-kubernetes\") pod \"d985b875-dd5e-4767-a4e2-209894575a8f\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.551051 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d985b875-dd5e-4767-a4e2-209894575a8f-ovnkube-config\") pod \"d985b875-dd5e-4767-a4e2-209894575a8f\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.551076 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-host-slash\") pod \"d985b875-dd5e-4767-a4e2-209894575a8f\" (UID: \"d985b875-dd5e-4767-a4e2-209894575a8f\") " Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.550853 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d985b875-dd5e-4767-a4e2-209894575a8f-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "d985b875-dd5e-4767-a4e2-209894575a8f" (UID: "d985b875-dd5e-4767-a4e2-209894575a8f"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.550879 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "d985b875-dd5e-4767-a4e2-209894575a8f" (UID: "d985b875-dd5e-4767-a4e2-209894575a8f"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.551079 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d985b875-dd5e-4767-a4e2-209894575a8f-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "d985b875-dd5e-4767-a4e2-209894575a8f" (UID: "d985b875-dd5e-4767-a4e2-209894575a8f"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.551103 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "d985b875-dd5e-4767-a4e2-209894575a8f" (UID: "d985b875-dd5e-4767-a4e2-209894575a8f"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.551152 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-node-log" (OuterVolumeSpecName: "node-log") pod "d985b875-dd5e-4767-a4e2-209894575a8f" (UID: "d985b875-dd5e-4767-a4e2-209894575a8f"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.551169 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "d985b875-dd5e-4767-a4e2-209894575a8f" (UID: "d985b875-dd5e-4767-a4e2-209894575a8f"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.551188 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "d985b875-dd5e-4767-a4e2-209894575a8f" (UID: "d985b875-dd5e-4767-a4e2-209894575a8f"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.551211 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "d985b875-dd5e-4767-a4e2-209894575a8f" (UID: "d985b875-dd5e-4767-a4e2-209894575a8f"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.551291 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/87aca778-6541-4f0e-a507-ead5a3fda02b-host-kubelet\") pod \"ovnkube-node-t4spw\" (UID: \"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.551323 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/87aca778-6541-4f0e-a507-ead5a3fda02b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-t4spw\" (UID: \"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.551352 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/87aca778-6541-4f0e-a507-ead5a3fda02b-run-systemd\") pod \"ovnkube-node-t4spw\" (UID: \"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.551373 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/87aca778-6541-4f0e-a507-ead5a3fda02b-ovnkube-script-lib\") pod \"ovnkube-node-t4spw\" (UID: \"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.551399 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/87aca778-6541-4f0e-a507-ead5a3fda02b-host-cni-netd\") pod \"ovnkube-node-t4spw\" (UID: \"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.551458 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d985b875-dd5e-4767-a4e2-209894575a8f-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "d985b875-dd5e-4767-a4e2-209894575a8f" (UID: "d985b875-dd5e-4767-a4e2-209894575a8f"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.551460 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/87aca778-6541-4f0e-a507-ead5a3fda02b-ovn-node-metrics-cert\") pod \"ovnkube-node-t4spw\" (UID: \"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.551499 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/87aca778-6541-4f0e-a507-ead5a3fda02b-host-run-ovn-kubernetes\") pod \"ovnkube-node-t4spw\" (UID: \"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.551583 4824 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-host-slash" (OuterVolumeSpecName: "host-slash") pod "d985b875-dd5e-4767-a4e2-209894575a8f" (UID: "d985b875-dd5e-4767-a4e2-209894575a8f"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.551612 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/87aca778-6541-4f0e-a507-ead5a3fda02b-env-overrides\") pod \"ovnkube-node-t4spw\" (UID: \"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.551638 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "d985b875-dd5e-4767-a4e2-209894575a8f" (UID: "d985b875-dd5e-4767-a4e2-209894575a8f"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.551659 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/87aca778-6541-4f0e-a507-ead5a3fda02b-etc-openvswitch\") pod \"ovnkube-node-t4spw\" (UID: \"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.551719 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhggx\" (UniqueName: \"kubernetes.io/projected/87aca778-6541-4f0e-a507-ead5a3fda02b-kube-api-access-xhggx\") pod \"ovnkube-node-t4spw\" (UID: \"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.551777 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/87aca778-6541-4f0e-a507-ead5a3fda02b-systemd-units\") pod \"ovnkube-node-t4spw\" (UID: \"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.551808 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/87aca778-6541-4f0e-a507-ead5a3fda02b-host-slash\") pod \"ovnkube-node-t4spw\" (UID: \"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.551842 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/87aca778-6541-4f0e-a507-ead5a3fda02b-var-lib-openvswitch\") pod \"ovnkube-node-t4spw\" (UID: 
\"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.551897 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/87aca778-6541-4f0e-a507-ead5a3fda02b-run-ovn\") pod \"ovnkube-node-t4spw\" (UID: \"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.551968 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/87aca778-6541-4f0e-a507-ead5a3fda02b-log-socket\") pod \"ovnkube-node-t4spw\" (UID: \"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.552004 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/87aca778-6541-4f0e-a507-ead5a3fda02b-run-openvswitch\") pod \"ovnkube-node-t4spw\" (UID: \"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.552099 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/87aca778-6541-4f0e-a507-ead5a3fda02b-node-log\") pod \"ovnkube-node-t4spw\" (UID: \"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.552122 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/87aca778-6541-4f0e-a507-ead5a3fda02b-ovnkube-config\") pod \"ovnkube-node-t4spw\" (UID: 
\"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.552149 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/87aca778-6541-4f0e-a507-ead5a3fda02b-host-run-netns\") pod \"ovnkube-node-t4spw\" (UID: \"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.552171 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/87aca778-6541-4f0e-a507-ead5a3fda02b-host-cni-bin\") pod \"ovnkube-node-t4spw\" (UID: \"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.552325 4824 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-node-log\") on node \"crc\" DevicePath \"\"" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.552348 4824 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.552366 4824 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-host-kubelet\") on node \"crc\" DevicePath \"\"" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.552383 4824 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 24 00:16:28 crc kubenswrapper[4824]: 
I0224 00:16:28.552399 4824 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.552414 4824 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d985b875-dd5e-4767-a4e2-209894575a8f-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.552432 4824 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-host-slash\") on node \"crc\" DevicePath \"\"" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.552447 4824 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-systemd-units\") on node \"crc\" DevicePath \"\"" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.552466 4824 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d985b875-dd5e-4767-a4e2-209894575a8f-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.552481 4824 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.552496 4824 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-host-cni-bin\") on node \"crc\" DevicePath \"\"" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.552551 4824 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-run-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.552568 4824 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.552582 4824 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-log-socket\") on node \"crc\" DevicePath \"\"" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.552600 4824 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-host-cni-netd\") on node \"crc\" DevicePath \"\"" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.552616 4824 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.552631 4824 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d985b875-dd5e-4767-a4e2-209894575a8f-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.556600 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d985b875-dd5e-4767-a4e2-209894575a8f-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "d985b875-dd5e-4767-a4e2-209894575a8f" (UID: "d985b875-dd5e-4767-a4e2-209894575a8f"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.556708 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d985b875-dd5e-4767-a4e2-209894575a8f-kube-api-access-x6rnj" (OuterVolumeSpecName: "kube-api-access-x6rnj") pod "d985b875-dd5e-4767-a4e2-209894575a8f" (UID: "d985b875-dd5e-4767-a4e2-209894575a8f"). InnerVolumeSpecName "kube-api-access-x6rnj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.567750 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "d985b875-dd5e-4767-a4e2-209894575a8f" (UID: "d985b875-dd5e-4767-a4e2-209894575a8f"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.653716 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/87aca778-6541-4f0e-a507-ead5a3fda02b-run-systemd\") pod \"ovnkube-node-t4spw\" (UID: \"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.653775 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/87aca778-6541-4f0e-a507-ead5a3fda02b-ovnkube-script-lib\") pod \"ovnkube-node-t4spw\" (UID: \"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.653801 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/87aca778-6541-4f0e-a507-ead5a3fda02b-host-cni-netd\") pod 
\"ovnkube-node-t4spw\" (UID: \"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.653829 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/87aca778-6541-4f0e-a507-ead5a3fda02b-ovn-node-metrics-cert\") pod \"ovnkube-node-t4spw\" (UID: \"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.653850 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/87aca778-6541-4f0e-a507-ead5a3fda02b-host-run-ovn-kubernetes\") pod \"ovnkube-node-t4spw\" (UID: \"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.653893 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/87aca778-6541-4f0e-a507-ead5a3fda02b-env-overrides\") pod \"ovnkube-node-t4spw\" (UID: \"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.653922 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/87aca778-6541-4f0e-a507-ead5a3fda02b-etc-openvswitch\") pod \"ovnkube-node-t4spw\" (UID: \"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.653933 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/87aca778-6541-4f0e-a507-ead5a3fda02b-host-cni-netd\") pod \"ovnkube-node-t4spw\" (UID: 
\"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.653943 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhggx\" (UniqueName: \"kubernetes.io/projected/87aca778-6541-4f0e-a507-ead5a3fda02b-kube-api-access-xhggx\") pod \"ovnkube-node-t4spw\" (UID: \"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.654060 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/87aca778-6541-4f0e-a507-ead5a3fda02b-systemd-units\") pod \"ovnkube-node-t4spw\" (UID: \"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.654100 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/87aca778-6541-4f0e-a507-ead5a3fda02b-host-run-ovn-kubernetes\") pod \"ovnkube-node-t4spw\" (UID: \"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.654127 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/87aca778-6541-4f0e-a507-ead5a3fda02b-systemd-units\") pod \"ovnkube-node-t4spw\" (UID: \"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.654152 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/87aca778-6541-4f0e-a507-ead5a3fda02b-host-slash\") pod \"ovnkube-node-t4spw\" (UID: \"87aca778-6541-4f0e-a507-ead5a3fda02b\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.654056 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/87aca778-6541-4f0e-a507-ead5a3fda02b-etc-openvswitch\") pod \"ovnkube-node-t4spw\" (UID: \"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.653933 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/87aca778-6541-4f0e-a507-ead5a3fda02b-run-systemd\") pod \"ovnkube-node-t4spw\" (UID: \"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.654224 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/87aca778-6541-4f0e-a507-ead5a3fda02b-var-lib-openvswitch\") pod \"ovnkube-node-t4spw\" (UID: \"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.654259 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/87aca778-6541-4f0e-a507-ead5a3fda02b-run-ovn\") pod \"ovnkube-node-t4spw\" (UID: \"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.654296 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/87aca778-6541-4f0e-a507-ead5a3fda02b-var-lib-openvswitch\") pod \"ovnkube-node-t4spw\" (UID: \"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 
00:16:28.654309 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/87aca778-6541-4f0e-a507-ead5a3fda02b-log-socket\") pod \"ovnkube-node-t4spw\" (UID: \"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.654209 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/87aca778-6541-4f0e-a507-ead5a3fda02b-host-slash\") pod \"ovnkube-node-t4spw\" (UID: \"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.654345 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/87aca778-6541-4f0e-a507-ead5a3fda02b-run-openvswitch\") pod \"ovnkube-node-t4spw\" (UID: \"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.654395 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/87aca778-6541-4f0e-a507-ead5a3fda02b-run-ovn\") pod \"ovnkube-node-t4spw\" (UID: \"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.654405 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/87aca778-6541-4f0e-a507-ead5a3fda02b-log-socket\") pod \"ovnkube-node-t4spw\" (UID: \"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.654446 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/87aca778-6541-4f0e-a507-ead5a3fda02b-node-log\") pod \"ovnkube-node-t4spw\" (UID: \"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.654484 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/87aca778-6541-4f0e-a507-ead5a3fda02b-node-log\") pod \"ovnkube-node-t4spw\" (UID: \"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.654483 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/87aca778-6541-4f0e-a507-ead5a3fda02b-run-openvswitch\") pod \"ovnkube-node-t4spw\" (UID: \"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.654552 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/87aca778-6541-4f0e-a507-ead5a3fda02b-ovnkube-config\") pod \"ovnkube-node-t4spw\" (UID: \"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.654622 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/87aca778-6541-4f0e-a507-ead5a3fda02b-host-run-netns\") pod \"ovnkube-node-t4spw\" (UID: \"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.654626 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/87aca778-6541-4f0e-a507-ead5a3fda02b-env-overrides\") pod \"ovnkube-node-t4spw\" (UID: 
\"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.654646 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/87aca778-6541-4f0e-a507-ead5a3fda02b-host-cni-bin\") pod \"ovnkube-node-t4spw\" (UID: \"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.654709 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/87aca778-6541-4f0e-a507-ead5a3fda02b-host-run-netns\") pod \"ovnkube-node-t4spw\" (UID: \"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.654789 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/87aca778-6541-4f0e-a507-ead5a3fda02b-host-cni-bin\") pod \"ovnkube-node-t4spw\" (UID: \"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.654836 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/87aca778-6541-4f0e-a507-ead5a3fda02b-host-kubelet\") pod \"ovnkube-node-t4spw\" (UID: \"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.654873 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/87aca778-6541-4f0e-a507-ead5a3fda02b-host-kubelet\") pod \"ovnkube-node-t4spw\" (UID: \"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc 
kubenswrapper[4824]: I0224 00:16:28.654900 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/87aca778-6541-4f0e-a507-ead5a3fda02b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-t4spw\" (UID: \"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.654960 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/87aca778-6541-4f0e-a507-ead5a3fda02b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-t4spw\" (UID: \"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.655045 4824 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d985b875-dd5e-4767-a4e2-209894575a8f-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.655078 4824 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d985b875-dd5e-4767-a4e2-209894575a8f-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.655109 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6rnj\" (UniqueName: \"kubernetes.io/projected/d985b875-dd5e-4767-a4e2-209894575a8f-kube-api-access-x6rnj\") on node \"crc\" DevicePath \"\"" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.655911 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/87aca778-6541-4f0e-a507-ead5a3fda02b-ovnkube-config\") pod \"ovnkube-node-t4spw\" (UID: \"87aca778-6541-4f0e-a507-ead5a3fda02b\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.656134 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/87aca778-6541-4f0e-a507-ead5a3fda02b-ovnkube-script-lib\") pod \"ovnkube-node-t4spw\" (UID: \"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.659920 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/87aca778-6541-4f0e-a507-ead5a3fda02b-ovn-node-metrics-cert\") pod \"ovnkube-node-t4spw\" (UID: \"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.684850 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhggx\" (UniqueName: \"kubernetes.io/projected/87aca778-6541-4f0e-a507-ead5a3fda02b-kube-api-access-xhggx\") pod \"ovnkube-node-t4spw\" (UID: \"87aca778-6541-4f0e-a507-ead5a3fda02b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.810023 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" Feb 24 00:16:28 crc kubenswrapper[4824]: W0224 00:16:28.843738 4824 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87aca778_6541_4f0e_a507_ead5a3fda02b.slice/crio-a79a38c4264a1af2b0bc4a17800759a0a464d70c47dd6c31a10502eb8c455761 WatchSource:0}: Error finding container a79a38c4264a1af2b0bc4a17800759a0a464d70c47dd6c31a10502eb8c455761: Status 404 returned error can't find the container with id a79a38c4264a1af2b0bc4a17800759a0a464d70c47dd6c31a10502eb8c455761 Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.873719 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xjg6_d985b875-dd5e-4767-a4e2-209894575a8f/ovnkube-controller/3.log" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.876439 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xjg6_d985b875-dd5e-4767-a4e2-209894575a8f/ovn-acl-logging/0.log" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.877168 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xjg6_d985b875-dd5e-4767-a4e2-209894575a8f/ovn-controller/0.log" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.877630 4824 generic.go:334] "Generic (PLEG): container finished" podID="d985b875-dd5e-4767-a4e2-209894575a8f" containerID="a7e1ce655b2a5d070089d56de6a8dfa5549a4154911bfccd6be2b38df9c3b5ee" exitCode=0 Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.877655 4824 generic.go:334] "Generic (PLEG): container finished" podID="d985b875-dd5e-4767-a4e2-209894575a8f" containerID="05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430" exitCode=0 Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.877666 4824 generic.go:334] "Generic (PLEG): container finished" podID="d985b875-dd5e-4767-a4e2-209894575a8f" 
containerID="4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7" exitCode=0 Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.877677 4824 generic.go:334] "Generic (PLEG): container finished" podID="d985b875-dd5e-4767-a4e2-209894575a8f" containerID="f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04" exitCode=0 Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.877685 4824 generic.go:334] "Generic (PLEG): container finished" podID="d985b875-dd5e-4767-a4e2-209894575a8f" containerID="b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2" exitCode=0 Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.877697 4824 generic.go:334] "Generic (PLEG): container finished" podID="d985b875-dd5e-4767-a4e2-209894575a8f" containerID="869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8" exitCode=0 Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.877706 4824 generic.go:334] "Generic (PLEG): container finished" podID="d985b875-dd5e-4767-a4e2-209894575a8f" containerID="8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44" exitCode=143 Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.877715 4824 generic.go:334] "Generic (PLEG): container finished" podID="d985b875-dd5e-4767-a4e2-209894575a8f" containerID="0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91" exitCode=143 Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.877750 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" event={"ID":"d985b875-dd5e-4767-a4e2-209894575a8f","Type":"ContainerDied","Data":"a7e1ce655b2a5d070089d56de6a8dfa5549a4154911bfccd6be2b38df9c3b5ee"} Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.877798 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" 
event={"ID":"d985b875-dd5e-4767-a4e2-209894575a8f","Type":"ContainerDied","Data":"05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430"} Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.877810 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" event={"ID":"d985b875-dd5e-4767-a4e2-209894575a8f","Type":"ContainerDied","Data":"4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7"} Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.877805 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.877835 4824 scope.go:117] "RemoveContainer" containerID="a7e1ce655b2a5d070089d56de6a8dfa5549a4154911bfccd6be2b38df9c3b5ee" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.877821 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" event={"ID":"d985b875-dd5e-4767-a4e2-209894575a8f","Type":"ContainerDied","Data":"f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04"} Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.878016 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" event={"ID":"d985b875-dd5e-4767-a4e2-209894575a8f","Type":"ContainerDied","Data":"b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2"} Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.878036 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" event={"ID":"d985b875-dd5e-4767-a4e2-209894575a8f","Type":"ContainerDied","Data":"869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8"} Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.878050 4824 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"5817c201e3633a7aa1c9aa18cf5c3bf28f2a23c56257be31e935e9e29bd522ac"} Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.878061 4824 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430"} Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.878069 4824 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7"} Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.878077 4824 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04"} Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.878083 4824 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2"} Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.878090 4824 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8"} Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.878096 4824 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44"} Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.878102 4824 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91"} Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.878109 4824 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d"} Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.878118 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" event={"ID":"d985b875-dd5e-4767-a4e2-209894575a8f","Type":"ContainerDied","Data":"8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44"} Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.878129 4824 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a7e1ce655b2a5d070089d56de6a8dfa5549a4154911bfccd6be2b38df9c3b5ee"} Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.878141 4824 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5817c201e3633a7aa1c9aa18cf5c3bf28f2a23c56257be31e935e9e29bd522ac"} Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.878147 4824 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430"} Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.878153 4824 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7"} Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.878159 4824 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04"} Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.878166 4824 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2"} Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.878175 4824 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8"} Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.878184 4824 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44"} Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.878190 4824 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91"} Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.878196 4824 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d"} Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.878207 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" event={"ID":"d985b875-dd5e-4767-a4e2-209894575a8f","Type":"ContainerDied","Data":"0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91"} Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.878218 4824 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a7e1ce655b2a5d070089d56de6a8dfa5549a4154911bfccd6be2b38df9c3b5ee"} Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.878226 4824 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5817c201e3633a7aa1c9aa18cf5c3bf28f2a23c56257be31e935e9e29bd522ac"} Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.878233 4824 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430"} Feb 24 
00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.878240 4824 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7"} Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.878246 4824 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04"} Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.878251 4824 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2"} Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.878257 4824 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8"} Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.878261 4824 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44"} Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.878266 4824 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91"} Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.878271 4824 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d"} Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.878280 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xjg6" 
event={"ID":"d985b875-dd5e-4767-a4e2-209894575a8f","Type":"ContainerDied","Data":"39c21b24d26f0ce7cc1f64fcb5e9960f6a2487988e095495d5e73beb90c5e099"} Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.878288 4824 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a7e1ce655b2a5d070089d56de6a8dfa5549a4154911bfccd6be2b38df9c3b5ee"} Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.878293 4824 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5817c201e3633a7aa1c9aa18cf5c3bf28f2a23c56257be31e935e9e29bd522ac"} Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.878299 4824 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430"} Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.878304 4824 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7"} Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.878308 4824 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04"} Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.878313 4824 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2"} Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.878318 4824 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8"} Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.878322 4824 pod_container_deletor.go:114] "Failed 
to issue the request to remove container" containerID={"Type":"cri-o","ID":"8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44"} Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.878327 4824 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91"} Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.878332 4824 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d"} Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.879137 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" event={"ID":"87aca778-6541-4f0e-a507-ead5a3fda02b","Type":"ContainerStarted","Data":"a79a38c4264a1af2b0bc4a17800759a0a464d70c47dd6c31a10502eb8c455761"} Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.880487 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wvqfl_15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac/kube-multus/2.log" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.880971 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wvqfl_15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac/kube-multus/1.log" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.881003 4824 generic.go:334] "Generic (PLEG): container finished" podID="15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac" containerID="e2df584c430cf17f7bb0674c0cc149453f39f49408337d9789565a34a1bfcb68" exitCode=2 Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.881031 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wvqfl" event={"ID":"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac","Type":"ContainerDied","Data":"e2df584c430cf17f7bb0674c0cc149453f39f49408337d9789565a34a1bfcb68"} Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 
00:16:28.881049 4824 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a39d567bc7ae642dd15ce2ea8650e1aa0fd3688e3c3c061ad145edf2bf963c06"} Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.881415 4824 scope.go:117] "RemoveContainer" containerID="e2df584c430cf17f7bb0674c0cc149453f39f49408337d9789565a34a1bfcb68" Feb 24 00:16:28 crc kubenswrapper[4824]: E0224 00:16:28.881628 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-wvqfl_openshift-multus(15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac)\"" pod="openshift-multus/multus-wvqfl" podUID="15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.909298 4824 scope.go:117] "RemoveContainer" containerID="5817c201e3633a7aa1c9aa18cf5c3bf28f2a23c56257be31e935e9e29bd522ac" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.928626 4824 scope.go:117] "RemoveContainer" containerID="05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.929195 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4xjg6"] Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.933787 4824 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4xjg6"] Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.944834 4824 scope.go:117] "RemoveContainer" containerID="4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.963187 4824 scope.go:117] "RemoveContainer" containerID="f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04" Feb 24 00:16:28 crc kubenswrapper[4824]: I0224 00:16:28.977029 4824 scope.go:117] "RemoveContainer" 
containerID="b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.054375 4824 scope.go:117] "RemoveContainer" containerID="869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.070211 4824 scope.go:117] "RemoveContainer" containerID="8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.083942 4824 scope.go:117] "RemoveContainer" containerID="0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.097136 4824 scope.go:117] "RemoveContainer" containerID="1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.112688 4824 scope.go:117] "RemoveContainer" containerID="a7e1ce655b2a5d070089d56de6a8dfa5549a4154911bfccd6be2b38df9c3b5ee" Feb 24 00:16:29 crc kubenswrapper[4824]: E0224 00:16:29.113071 4824 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7e1ce655b2a5d070089d56de6a8dfa5549a4154911bfccd6be2b38df9c3b5ee\": container with ID starting with a7e1ce655b2a5d070089d56de6a8dfa5549a4154911bfccd6be2b38df9c3b5ee not found: ID does not exist" containerID="a7e1ce655b2a5d070089d56de6a8dfa5549a4154911bfccd6be2b38df9c3b5ee" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.113131 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7e1ce655b2a5d070089d56de6a8dfa5549a4154911bfccd6be2b38df9c3b5ee"} err="failed to get container status \"a7e1ce655b2a5d070089d56de6a8dfa5549a4154911bfccd6be2b38df9c3b5ee\": rpc error: code = NotFound desc = could not find container \"a7e1ce655b2a5d070089d56de6a8dfa5549a4154911bfccd6be2b38df9c3b5ee\": container with ID starting with 
a7e1ce655b2a5d070089d56de6a8dfa5549a4154911bfccd6be2b38df9c3b5ee not found: ID does not exist" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.113165 4824 scope.go:117] "RemoveContainer" containerID="5817c201e3633a7aa1c9aa18cf5c3bf28f2a23c56257be31e935e9e29bd522ac" Feb 24 00:16:29 crc kubenswrapper[4824]: E0224 00:16:29.113755 4824 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5817c201e3633a7aa1c9aa18cf5c3bf28f2a23c56257be31e935e9e29bd522ac\": container with ID starting with 5817c201e3633a7aa1c9aa18cf5c3bf28f2a23c56257be31e935e9e29bd522ac not found: ID does not exist" containerID="5817c201e3633a7aa1c9aa18cf5c3bf28f2a23c56257be31e935e9e29bd522ac" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.113803 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5817c201e3633a7aa1c9aa18cf5c3bf28f2a23c56257be31e935e9e29bd522ac"} err="failed to get container status \"5817c201e3633a7aa1c9aa18cf5c3bf28f2a23c56257be31e935e9e29bd522ac\": rpc error: code = NotFound desc = could not find container \"5817c201e3633a7aa1c9aa18cf5c3bf28f2a23c56257be31e935e9e29bd522ac\": container with ID starting with 5817c201e3633a7aa1c9aa18cf5c3bf28f2a23c56257be31e935e9e29bd522ac not found: ID does not exist" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.113834 4824 scope.go:117] "RemoveContainer" containerID="05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430" Feb 24 00:16:29 crc kubenswrapper[4824]: E0224 00:16:29.114406 4824 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430\": container with ID starting with 05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430 not found: ID does not exist" containerID="05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430" Feb 24 00:16:29 crc 
kubenswrapper[4824]: I0224 00:16:29.114467 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430"} err="failed to get container status \"05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430\": rpc error: code = NotFound desc = could not find container \"05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430\": container with ID starting with 05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430 not found: ID does not exist" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.114489 4824 scope.go:117] "RemoveContainer" containerID="4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7" Feb 24 00:16:29 crc kubenswrapper[4824]: E0224 00:16:29.114968 4824 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7\": container with ID starting with 4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7 not found: ID does not exist" containerID="4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.115000 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7"} err="failed to get container status \"4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7\": rpc error: code = NotFound desc = could not find container \"4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7\": container with ID starting with 4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7 not found: ID does not exist" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.115023 4824 scope.go:117] "RemoveContainer" containerID="f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04" Feb 24 
00:16:29 crc kubenswrapper[4824]: E0224 00:16:29.115484 4824 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04\": container with ID starting with f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04 not found: ID does not exist" containerID="f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.115503 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04"} err="failed to get container status \"f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04\": rpc error: code = NotFound desc = could not find container \"f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04\": container with ID starting with f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04 not found: ID does not exist" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.115531 4824 scope.go:117] "RemoveContainer" containerID="b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2" Feb 24 00:16:29 crc kubenswrapper[4824]: E0224 00:16:29.115932 4824 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2\": container with ID starting with b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2 not found: ID does not exist" containerID="b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.115952 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2"} err="failed to get container status 
\"b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2\": rpc error: code = NotFound desc = could not find container \"b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2\": container with ID starting with b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2 not found: ID does not exist" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.115963 4824 scope.go:117] "RemoveContainer" containerID="869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8" Feb 24 00:16:29 crc kubenswrapper[4824]: E0224 00:16:29.116225 4824 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8\": container with ID starting with 869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8 not found: ID does not exist" containerID="869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.116250 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8"} err="failed to get container status \"869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8\": rpc error: code = NotFound desc = could not find container \"869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8\": container with ID starting with 869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8 not found: ID does not exist" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.116263 4824 scope.go:117] "RemoveContainer" containerID="8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44" Feb 24 00:16:29 crc kubenswrapper[4824]: E0224 00:16:29.116897 4824 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44\": container with ID starting with 8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44 not found: ID does not exist" containerID="8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.116935 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44"} err="failed to get container status \"8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44\": rpc error: code = NotFound desc = could not find container \"8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44\": container with ID starting with 8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44 not found: ID does not exist" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.116956 4824 scope.go:117] "RemoveContainer" containerID="0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91" Feb 24 00:16:29 crc kubenswrapper[4824]: E0224 00:16:29.117470 4824 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91\": container with ID starting with 0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91 not found: ID does not exist" containerID="0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.117511 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91"} err="failed to get container status \"0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91\": rpc error: code = NotFound desc = could not find container \"0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91\": container with ID 
starting with 0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91 not found: ID does not exist" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.117557 4824 scope.go:117] "RemoveContainer" containerID="1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d" Feb 24 00:16:29 crc kubenswrapper[4824]: E0224 00:16:29.117793 4824 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\": container with ID starting with 1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d not found: ID does not exist" containerID="1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.117824 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d"} err="failed to get container status \"1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\": rpc error: code = NotFound desc = could not find container \"1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\": container with ID starting with 1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d not found: ID does not exist" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.117843 4824 scope.go:117] "RemoveContainer" containerID="a7e1ce655b2a5d070089d56de6a8dfa5549a4154911bfccd6be2b38df9c3b5ee" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.118261 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7e1ce655b2a5d070089d56de6a8dfa5549a4154911bfccd6be2b38df9c3b5ee"} err="failed to get container status \"a7e1ce655b2a5d070089d56de6a8dfa5549a4154911bfccd6be2b38df9c3b5ee\": rpc error: code = NotFound desc = could not find container \"a7e1ce655b2a5d070089d56de6a8dfa5549a4154911bfccd6be2b38df9c3b5ee\": 
container with ID starting with a7e1ce655b2a5d070089d56de6a8dfa5549a4154911bfccd6be2b38df9c3b5ee not found: ID does not exist" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.118296 4824 scope.go:117] "RemoveContainer" containerID="5817c201e3633a7aa1c9aa18cf5c3bf28f2a23c56257be31e935e9e29bd522ac" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.118607 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5817c201e3633a7aa1c9aa18cf5c3bf28f2a23c56257be31e935e9e29bd522ac"} err="failed to get container status \"5817c201e3633a7aa1c9aa18cf5c3bf28f2a23c56257be31e935e9e29bd522ac\": rpc error: code = NotFound desc = could not find container \"5817c201e3633a7aa1c9aa18cf5c3bf28f2a23c56257be31e935e9e29bd522ac\": container with ID starting with 5817c201e3633a7aa1c9aa18cf5c3bf28f2a23c56257be31e935e9e29bd522ac not found: ID does not exist" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.118666 4824 scope.go:117] "RemoveContainer" containerID="05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.118939 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430"} err="failed to get container status \"05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430\": rpc error: code = NotFound desc = could not find container \"05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430\": container with ID starting with 05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430 not found: ID does not exist" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.118966 4824 scope.go:117] "RemoveContainer" containerID="4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.119209 4824 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7"} err="failed to get container status \"4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7\": rpc error: code = NotFound desc = could not find container \"4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7\": container with ID starting with 4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7 not found: ID does not exist" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.119236 4824 scope.go:117] "RemoveContainer" containerID="f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.119694 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04"} err="failed to get container status \"f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04\": rpc error: code = NotFound desc = could not find container \"f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04\": container with ID starting with f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04 not found: ID does not exist" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.119723 4824 scope.go:117] "RemoveContainer" containerID="b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.120145 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2"} err="failed to get container status \"b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2\": rpc error: code = NotFound desc = could not find container \"b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2\": container with ID starting with b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2 not found: ID does not 
exist" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.120174 4824 scope.go:117] "RemoveContainer" containerID="869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.120504 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8"} err="failed to get container status \"869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8\": rpc error: code = NotFound desc = could not find container \"869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8\": container with ID starting with 869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8 not found: ID does not exist" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.120631 4824 scope.go:117] "RemoveContainer" containerID="8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.121095 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44"} err="failed to get container status \"8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44\": rpc error: code = NotFound desc = could not find container \"8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44\": container with ID starting with 8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44 not found: ID does not exist" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.121123 4824 scope.go:117] "RemoveContainer" containerID="0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.121583 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91"} err="failed to get container status 
\"0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91\": rpc error: code = NotFound desc = could not find container \"0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91\": container with ID starting with 0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91 not found: ID does not exist" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.121610 4824 scope.go:117] "RemoveContainer" containerID="1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.121821 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d"} err="failed to get container status \"1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\": rpc error: code = NotFound desc = could not find container \"1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\": container with ID starting with 1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d not found: ID does not exist" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.121847 4824 scope.go:117] "RemoveContainer" containerID="a7e1ce655b2a5d070089d56de6a8dfa5549a4154911bfccd6be2b38df9c3b5ee" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.122231 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7e1ce655b2a5d070089d56de6a8dfa5549a4154911bfccd6be2b38df9c3b5ee"} err="failed to get container status \"a7e1ce655b2a5d070089d56de6a8dfa5549a4154911bfccd6be2b38df9c3b5ee\": rpc error: code = NotFound desc = could not find container \"a7e1ce655b2a5d070089d56de6a8dfa5549a4154911bfccd6be2b38df9c3b5ee\": container with ID starting with a7e1ce655b2a5d070089d56de6a8dfa5549a4154911bfccd6be2b38df9c3b5ee not found: ID does not exist" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.122259 4824 scope.go:117] "RemoveContainer" 
containerID="5817c201e3633a7aa1c9aa18cf5c3bf28f2a23c56257be31e935e9e29bd522ac" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.122536 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5817c201e3633a7aa1c9aa18cf5c3bf28f2a23c56257be31e935e9e29bd522ac"} err="failed to get container status \"5817c201e3633a7aa1c9aa18cf5c3bf28f2a23c56257be31e935e9e29bd522ac\": rpc error: code = NotFound desc = could not find container \"5817c201e3633a7aa1c9aa18cf5c3bf28f2a23c56257be31e935e9e29bd522ac\": container with ID starting with 5817c201e3633a7aa1c9aa18cf5c3bf28f2a23c56257be31e935e9e29bd522ac not found: ID does not exist" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.122564 4824 scope.go:117] "RemoveContainer" containerID="05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.122866 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430"} err="failed to get container status \"05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430\": rpc error: code = NotFound desc = could not find container \"05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430\": container with ID starting with 05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430 not found: ID does not exist" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.122892 4824 scope.go:117] "RemoveContainer" containerID="4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.123292 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7"} err="failed to get container status \"4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7\": rpc error: code = NotFound desc = could 
not find container \"4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7\": container with ID starting with 4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7 not found: ID does not exist" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.123323 4824 scope.go:117] "RemoveContainer" containerID="f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.123616 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04"} err="failed to get container status \"f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04\": rpc error: code = NotFound desc = could not find container \"f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04\": container with ID starting with f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04 not found: ID does not exist" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.123642 4824 scope.go:117] "RemoveContainer" containerID="b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.123911 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2"} err="failed to get container status \"b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2\": rpc error: code = NotFound desc = could not find container \"b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2\": container with ID starting with b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2 not found: ID does not exist" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.123935 4824 scope.go:117] "RemoveContainer" containerID="869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 
00:16:29.124359 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8"} err="failed to get container status \"869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8\": rpc error: code = NotFound desc = could not find container \"869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8\": container with ID starting with 869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8 not found: ID does not exist" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.124386 4824 scope.go:117] "RemoveContainer" containerID="8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.124690 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44"} err="failed to get container status \"8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44\": rpc error: code = NotFound desc = could not find container \"8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44\": container with ID starting with 8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44 not found: ID does not exist" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.124730 4824 scope.go:117] "RemoveContainer" containerID="0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.125043 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91"} err="failed to get container status \"0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91\": rpc error: code = NotFound desc = could not find container \"0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91\": container with ID starting with 
0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91 not found: ID does not exist" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.125080 4824 scope.go:117] "RemoveContainer" containerID="1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.125568 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d"} err="failed to get container status \"1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\": rpc error: code = NotFound desc = could not find container \"1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\": container with ID starting with 1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d not found: ID does not exist" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.125596 4824 scope.go:117] "RemoveContainer" containerID="a7e1ce655b2a5d070089d56de6a8dfa5549a4154911bfccd6be2b38df9c3b5ee" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.125817 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7e1ce655b2a5d070089d56de6a8dfa5549a4154911bfccd6be2b38df9c3b5ee"} err="failed to get container status \"a7e1ce655b2a5d070089d56de6a8dfa5549a4154911bfccd6be2b38df9c3b5ee\": rpc error: code = NotFound desc = could not find container \"a7e1ce655b2a5d070089d56de6a8dfa5549a4154911bfccd6be2b38df9c3b5ee\": container with ID starting with a7e1ce655b2a5d070089d56de6a8dfa5549a4154911bfccd6be2b38df9c3b5ee not found: ID does not exist" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.125846 4824 scope.go:117] "RemoveContainer" containerID="5817c201e3633a7aa1c9aa18cf5c3bf28f2a23c56257be31e935e9e29bd522ac" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.126386 4824 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5817c201e3633a7aa1c9aa18cf5c3bf28f2a23c56257be31e935e9e29bd522ac"} err="failed to get container status \"5817c201e3633a7aa1c9aa18cf5c3bf28f2a23c56257be31e935e9e29bd522ac\": rpc error: code = NotFound desc = could not find container \"5817c201e3633a7aa1c9aa18cf5c3bf28f2a23c56257be31e935e9e29bd522ac\": container with ID starting with 5817c201e3633a7aa1c9aa18cf5c3bf28f2a23c56257be31e935e9e29bd522ac not found: ID does not exist" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.126413 4824 scope.go:117] "RemoveContainer" containerID="05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.127032 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430"} err="failed to get container status \"05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430\": rpc error: code = NotFound desc = could not find container \"05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430\": container with ID starting with 05015554a7e8c72722c7f73491f24e41b294d2038dead07caff46440b0a52430 not found: ID does not exist" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.127060 4824 scope.go:117] "RemoveContainer" containerID="4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.127376 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7"} err="failed to get container status \"4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7\": rpc error: code = NotFound desc = could not find container \"4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7\": container with ID starting with 4bdc7abfd0f25251633d17ba431c36dd07c19b7f3a701ac3ea74a12796534fa7 not found: ID does not 
exist" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.127404 4824 scope.go:117] "RemoveContainer" containerID="f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.127726 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04"} err="failed to get container status \"f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04\": rpc error: code = NotFound desc = could not find container \"f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04\": container with ID starting with f961e1ae6ad003099155a819d7589b3ba8d5bbdc65e7bd1fa6b52ee88c489d04 not found: ID does not exist" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.127753 4824 scope.go:117] "RemoveContainer" containerID="b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.128151 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2"} err="failed to get container status \"b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2\": rpc error: code = NotFound desc = could not find container \"b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2\": container with ID starting with b933c4614e02f1261c4ac6b6b76c9ca4b4a0f66b500097d02f9e99cd6714c0c2 not found: ID does not exist" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.128176 4824 scope.go:117] "RemoveContainer" containerID="869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.128421 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8"} err="failed to get container status 
\"869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8\": rpc error: code = NotFound desc = could not find container \"869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8\": container with ID starting with 869a378050c78b5bfbce937f15dbad3e040afe6125d7cf374215ef0e1af9c2e8 not found: ID does not exist" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.128449 4824 scope.go:117] "RemoveContainer" containerID="8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.128883 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44"} err="failed to get container status \"8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44\": rpc error: code = NotFound desc = could not find container \"8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44\": container with ID starting with 8c207a8cee28dac50b2a822bec679f42de8ebbc1e3ad3fb020a6fdfc6886ac44 not found: ID does not exist" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.128915 4824 scope.go:117] "RemoveContainer" containerID="0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.129152 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91"} err="failed to get container status \"0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91\": rpc error: code = NotFound desc = could not find container \"0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91\": container with ID starting with 0fa8ec48b6d5542d77c867525ce787c3e7e0cd74f87cfb263a9ed31917927b91 not found: ID does not exist" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.129177 4824 scope.go:117] "RemoveContainer" 
containerID="1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.129449 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d"} err="failed to get container status \"1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\": rpc error: code = NotFound desc = could not find container \"1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d\": container with ID starting with 1cd4549610e3a7e2be3c5937d7b3ea2f2395accfb4af6e505168220b33cda00d not found: ID does not exist" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.129474 4824 scope.go:117] "RemoveContainer" containerID="a7e1ce655b2a5d070089d56de6a8dfa5549a4154911bfccd6be2b38df9c3b5ee" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.129912 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7e1ce655b2a5d070089d56de6a8dfa5549a4154911bfccd6be2b38df9c3b5ee"} err="failed to get container status \"a7e1ce655b2a5d070089d56de6a8dfa5549a4154911bfccd6be2b38df9c3b5ee\": rpc error: code = NotFound desc = could not find container \"a7e1ce655b2a5d070089d56de6a8dfa5549a4154911bfccd6be2b38df9c3b5ee\": container with ID starting with a7e1ce655b2a5d070089d56de6a8dfa5549a4154911bfccd6be2b38df9c3b5ee not found: ID does not exist" Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.892394 4824 generic.go:334] "Generic (PLEG): container finished" podID="87aca778-6541-4f0e-a507-ead5a3fda02b" containerID="e9b5d71c6d6ab2a571df3b8c2466c1f1518b8f93d49d67a2431aaac13abdd818" exitCode=0 Feb 24 00:16:29 crc kubenswrapper[4824]: I0224 00:16:29.892470 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" 
event={"ID":"87aca778-6541-4f0e-a507-ead5a3fda02b","Type":"ContainerDied","Data":"e9b5d71c6d6ab2a571df3b8c2466c1f1518b8f93d49d67a2431aaac13abdd818"} Feb 24 00:16:30 crc kubenswrapper[4824]: I0224 00:16:30.702783 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d985b875-dd5e-4767-a4e2-209894575a8f" path="/var/lib/kubelet/pods/d985b875-dd5e-4767-a4e2-209894575a8f/volumes" Feb 24 00:16:30 crc kubenswrapper[4824]: I0224 00:16:30.910353 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" event={"ID":"87aca778-6541-4f0e-a507-ead5a3fda02b","Type":"ContainerStarted","Data":"abfa01f497b5a2fef06efa2d3b4f068777bd2e7c24eea5b2af12267365af91da"} Feb 24 00:16:30 crc kubenswrapper[4824]: I0224 00:16:30.910404 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" event={"ID":"87aca778-6541-4f0e-a507-ead5a3fda02b","Type":"ContainerStarted","Data":"e334e54c3bed63e8e5351cb59818891a193cf2d05dbbe6298f837bb697f7a687"} Feb 24 00:16:30 crc kubenswrapper[4824]: I0224 00:16:30.910418 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" event={"ID":"87aca778-6541-4f0e-a507-ead5a3fda02b","Type":"ContainerStarted","Data":"3bba66335f2e0738f5bc158c1dac715556ad4f81990f2963122432435a924c2d"} Feb 24 00:16:30 crc kubenswrapper[4824]: I0224 00:16:30.910430 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" event={"ID":"87aca778-6541-4f0e-a507-ead5a3fda02b","Type":"ContainerStarted","Data":"1a0bf0abc9b07ec27b01757009988bc9cb44d1dd824ec4b16ede5d468a11f4be"} Feb 24 00:16:30 crc kubenswrapper[4824]: I0224 00:16:30.910444 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" event={"ID":"87aca778-6541-4f0e-a507-ead5a3fda02b","Type":"ContainerStarted","Data":"fa243a57aebd7ec4e6329f2bfb9eeb8b165ec5b1c0f91d4a044881000a37b7c1"} 
Feb 24 00:16:30 crc kubenswrapper[4824]: I0224 00:16:30.910456 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" event={"ID":"87aca778-6541-4f0e-a507-ead5a3fda02b","Type":"ContainerStarted","Data":"1fa7b8e9723463aebc01d4e46d77b21c283dd62251d21446425e1df801448768"}
Feb 24 00:16:33 crc kubenswrapper[4824]: I0224 00:16:33.935931 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" event={"ID":"87aca778-6541-4f0e-a507-ead5a3fda02b","Type":"ContainerStarted","Data":"8c1150a736dd7bf2f64ca69df260687e576899e7e644ecb3cd1b00e9d01a6231"}
Feb 24 00:16:35 crc kubenswrapper[4824]: I0224 00:16:35.954016 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" event={"ID":"87aca778-6541-4f0e-a507-ead5a3fda02b","Type":"ContainerStarted","Data":"b491b1a957c21214b1275b30eb068f5ca35e1048f7cc55ec02f64d18569a5307"}
Feb 24 00:16:35 crc kubenswrapper[4824]: I0224 00:16:35.954460 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-t4spw"
Feb 24 00:16:35 crc kubenswrapper[4824]: I0224 00:16:35.954535 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-t4spw"
Feb 24 00:16:35 crc kubenswrapper[4824]: I0224 00:16:35.984834 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-t4spw"
Feb 24 00:16:35 crc kubenswrapper[4824]: I0224 00:16:35.988309 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-t4spw" podStartSLOduration=7.9882871269999995 podStartE2EDuration="7.988287127s" podCreationTimestamp="2026-02-24 00:16:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:16:35.984469548 +0000 UTC m=+659.974094027" watchObservedRunningTime="2026-02-24 00:16:35.988287127 +0000 UTC m=+659.977911616"
Feb 24 00:16:36 crc kubenswrapper[4824]: I0224 00:16:36.960923 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-t4spw"
Feb 24 00:16:36 crc kubenswrapper[4824]: I0224 00:16:36.991446 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-t4spw"
Feb 24 00:16:37 crc kubenswrapper[4824]: I0224 00:16:37.151569 4824 scope.go:117] "RemoveContainer" containerID="a39d567bc7ae642dd15ce2ea8650e1aa0fd3688e3c3c061ad145edf2bf963c06"
Feb 24 00:16:37 crc kubenswrapper[4824]: I0224 00:16:37.966572 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wvqfl_15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac/kube-multus/2.log"
Feb 24 00:16:43 crc kubenswrapper[4824]: I0224 00:16:43.693746 4824 scope.go:117] "RemoveContainer" containerID="e2df584c430cf17f7bb0674c0cc149453f39f49408337d9789565a34a1bfcb68"
Feb 24 00:16:43 crc kubenswrapper[4824]: E0224 00:16:43.694431 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-wvqfl_openshift-multus(15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac)\"" pod="openshift-multus/multus-wvqfl" podUID="15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac"
Feb 24 00:16:53 crc kubenswrapper[4824]: I0224 00:16:53.276229 4824 patch_prober.go:28] interesting pod/machine-config-daemon-vcbgn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 24 00:16:53 crc kubenswrapper[4824]: I0224 00:16:53.277013 4824 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 24 00:16:53 crc kubenswrapper[4824]: I0224 00:16:53.277105 4824 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn"
Feb 24 00:16:53 crc kubenswrapper[4824]: I0224 00:16:53.278227 4824 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"14f28b64a526a9334cfaacd13a3a23756d3ea46670a60bcfe695a7e80551056e"} pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 24 00:16:53 crc kubenswrapper[4824]: I0224 00:16:53.278336 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" containerName="machine-config-daemon" containerID="cri-o://14f28b64a526a9334cfaacd13a3a23756d3ea46670a60bcfe695a7e80551056e" gracePeriod=600
Feb 24 00:16:54 crc kubenswrapper[4824]: I0224 00:16:54.095818 4824 generic.go:334] "Generic (PLEG): container finished" podID="939ca085-9383-42e6-b7d6-37f101137273" containerID="14f28b64a526a9334cfaacd13a3a23756d3ea46670a60bcfe695a7e80551056e" exitCode=0
Feb 24 00:16:54 crc kubenswrapper[4824]: I0224 00:16:54.095897 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" event={"ID":"939ca085-9383-42e6-b7d6-37f101137273","Type":"ContainerDied","Data":"14f28b64a526a9334cfaacd13a3a23756d3ea46670a60bcfe695a7e80551056e"}
Feb 24 00:16:54 crc kubenswrapper[4824]: I0224 00:16:54.096131 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" event={"ID":"939ca085-9383-42e6-b7d6-37f101137273","Type":"ContainerStarted","Data":"43fc5998f7ab77a1ca73519cb6a4280f5869d3a50153e1dc6202d26bc4d9b6a3"}
Feb 24 00:16:54 crc kubenswrapper[4824]: I0224 00:16:54.096155 4824 scope.go:117] "RemoveContainer" containerID="ec5f29f7aaf13391c2278f1eb972e5c2f9ed40d998b7f6d08d6d97e54173df94"
Feb 24 00:16:58 crc kubenswrapper[4824]: I0224 00:16:58.694160 4824 scope.go:117] "RemoveContainer" containerID="e2df584c430cf17f7bb0674c0cc149453f39f49408337d9789565a34a1bfcb68"
Feb 24 00:16:58 crc kubenswrapper[4824]: I0224 00:16:58.839906 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-t4spw"
Feb 24 00:16:59 crc kubenswrapper[4824]: I0224 00:16:59.129383 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wvqfl_15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac/kube-multus/2.log"
Feb 24 00:16:59 crc kubenswrapper[4824]: I0224 00:16:59.129444 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wvqfl" event={"ID":"15b9ae43-8f87-4f2f-a8d9-b55c8fa986ac","Type":"ContainerStarted","Data":"ebbf9d60a6e27302379e600ac283f0a46e39af0887f9444dc1533d94512c6024"}
Feb 24 00:17:27 crc kubenswrapper[4824]: I0224 00:17:27.867096 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-49mft"]
Feb 24 00:17:27 crc kubenswrapper[4824]: I0224 00:17:27.868573 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-49mft" podUID="a392c527-174d-4f66-a7cd-5f625192f3c7" containerName="registry-server" containerID="cri-o://50e709672894167a7582bd8375871ba68b8ed732f6c8d86a62300ec1ec37ba43" gracePeriod=30
Feb 24 00:17:28 crc kubenswrapper[4824]: I0224 00:17:28.256464 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-49mft"
Feb 24 00:17:28 crc kubenswrapper[4824]: I0224 00:17:28.332673 4824 generic.go:334] "Generic (PLEG): container finished" podID="a392c527-174d-4f66-a7cd-5f625192f3c7" containerID="50e709672894167a7582bd8375871ba68b8ed732f6c8d86a62300ec1ec37ba43" exitCode=0
Feb 24 00:17:28 crc kubenswrapper[4824]: I0224 00:17:28.332725 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-49mft" event={"ID":"a392c527-174d-4f66-a7cd-5f625192f3c7","Type":"ContainerDied","Data":"50e709672894167a7582bd8375871ba68b8ed732f6c8d86a62300ec1ec37ba43"}
Feb 24 00:17:28 crc kubenswrapper[4824]: I0224 00:17:28.332763 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-49mft" event={"ID":"a392c527-174d-4f66-a7cd-5f625192f3c7","Type":"ContainerDied","Data":"c3b19531b333b7d03df785c5e3fd25d1eeba8ccb1da22f7e993d8082b132b9a9"}
Feb 24 00:17:28 crc kubenswrapper[4824]: I0224 00:17:28.332781 4824 scope.go:117] "RemoveContainer" containerID="50e709672894167a7582bd8375871ba68b8ed732f6c8d86a62300ec1ec37ba43"
Feb 24 00:17:28 crc kubenswrapper[4824]: I0224 00:17:28.332809 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-49mft"
Feb 24 00:17:28 crc kubenswrapper[4824]: I0224 00:17:28.334293 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a392c527-174d-4f66-a7cd-5f625192f3c7-utilities\") pod \"a392c527-174d-4f66-a7cd-5f625192f3c7\" (UID: \"a392c527-174d-4f66-a7cd-5f625192f3c7\") "
Feb 24 00:17:28 crc kubenswrapper[4824]: I0224 00:17:28.334344 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hxsv\" (UniqueName: \"kubernetes.io/projected/a392c527-174d-4f66-a7cd-5f625192f3c7-kube-api-access-5hxsv\") pod \"a392c527-174d-4f66-a7cd-5f625192f3c7\" (UID: \"a392c527-174d-4f66-a7cd-5f625192f3c7\") "
Feb 24 00:17:28 crc kubenswrapper[4824]: I0224 00:17:28.334429 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a392c527-174d-4f66-a7cd-5f625192f3c7-catalog-content\") pod \"a392c527-174d-4f66-a7cd-5f625192f3c7\" (UID: \"a392c527-174d-4f66-a7cd-5f625192f3c7\") "
Feb 24 00:17:28 crc kubenswrapper[4824]: I0224 00:17:28.335684 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a392c527-174d-4f66-a7cd-5f625192f3c7-utilities" (OuterVolumeSpecName: "utilities") pod "a392c527-174d-4f66-a7cd-5f625192f3c7" (UID: "a392c527-174d-4f66-a7cd-5f625192f3c7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 00:17:28 crc kubenswrapper[4824]: I0224 00:17:28.341576 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a392c527-174d-4f66-a7cd-5f625192f3c7-kube-api-access-5hxsv" (OuterVolumeSpecName: "kube-api-access-5hxsv") pod "a392c527-174d-4f66-a7cd-5f625192f3c7" (UID: "a392c527-174d-4f66-a7cd-5f625192f3c7"). InnerVolumeSpecName "kube-api-access-5hxsv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 00:17:28 crc kubenswrapper[4824]: I0224 00:17:28.354384 4824 scope.go:117] "RemoveContainer" containerID="c010092ad92facedf5f70ec1775c1db569552e65318cda2aa08c7010e7ba052b"
Feb 24 00:17:28 crc kubenswrapper[4824]: I0224 00:17:28.358878 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a392c527-174d-4f66-a7cd-5f625192f3c7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a392c527-174d-4f66-a7cd-5f625192f3c7" (UID: "a392c527-174d-4f66-a7cd-5f625192f3c7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 00:17:28 crc kubenswrapper[4824]: I0224 00:17:28.370959 4824 scope.go:117] "RemoveContainer" containerID="cbad1694c39409f46b82fab5f1dbe7bae8d23677e9a40e2d515237465aad7438"
Feb 24 00:17:28 crc kubenswrapper[4824]: I0224 00:17:28.388003 4824 scope.go:117] "RemoveContainer" containerID="50e709672894167a7582bd8375871ba68b8ed732f6c8d86a62300ec1ec37ba43"
Feb 24 00:17:28 crc kubenswrapper[4824]: E0224 00:17:28.388624 4824 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50e709672894167a7582bd8375871ba68b8ed732f6c8d86a62300ec1ec37ba43\": container with ID starting with 50e709672894167a7582bd8375871ba68b8ed732f6c8d86a62300ec1ec37ba43 not found: ID does not exist" containerID="50e709672894167a7582bd8375871ba68b8ed732f6c8d86a62300ec1ec37ba43"
Feb 24 00:17:28 crc kubenswrapper[4824]: I0224 00:17:28.388677 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50e709672894167a7582bd8375871ba68b8ed732f6c8d86a62300ec1ec37ba43"} err="failed to get container status \"50e709672894167a7582bd8375871ba68b8ed732f6c8d86a62300ec1ec37ba43\": rpc error: code = NotFound desc = could not find container \"50e709672894167a7582bd8375871ba68b8ed732f6c8d86a62300ec1ec37ba43\": container with ID starting with 50e709672894167a7582bd8375871ba68b8ed732f6c8d86a62300ec1ec37ba43 not found: ID does not exist"
Feb 24 00:17:28 crc kubenswrapper[4824]: I0224 00:17:28.388708 4824 scope.go:117] "RemoveContainer" containerID="c010092ad92facedf5f70ec1775c1db569552e65318cda2aa08c7010e7ba052b"
Feb 24 00:17:28 crc kubenswrapper[4824]: E0224 00:17:28.389129 4824 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c010092ad92facedf5f70ec1775c1db569552e65318cda2aa08c7010e7ba052b\": container with ID starting with c010092ad92facedf5f70ec1775c1db569552e65318cda2aa08c7010e7ba052b not found: ID does not exist" containerID="c010092ad92facedf5f70ec1775c1db569552e65318cda2aa08c7010e7ba052b"
Feb 24 00:17:28 crc kubenswrapper[4824]: I0224 00:17:28.389151 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c010092ad92facedf5f70ec1775c1db569552e65318cda2aa08c7010e7ba052b"} err="failed to get container status \"c010092ad92facedf5f70ec1775c1db569552e65318cda2aa08c7010e7ba052b\": rpc error: code = NotFound desc = could not find container \"c010092ad92facedf5f70ec1775c1db569552e65318cda2aa08c7010e7ba052b\": container with ID starting with c010092ad92facedf5f70ec1775c1db569552e65318cda2aa08c7010e7ba052b not found: ID does not exist"
Feb 24 00:17:28 crc kubenswrapper[4824]: I0224 00:17:28.389163 4824 scope.go:117] "RemoveContainer" containerID="cbad1694c39409f46b82fab5f1dbe7bae8d23677e9a40e2d515237465aad7438"
Feb 24 00:17:28 crc kubenswrapper[4824]: E0224 00:17:28.389448 4824 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbad1694c39409f46b82fab5f1dbe7bae8d23677e9a40e2d515237465aad7438\": container with ID starting with cbad1694c39409f46b82fab5f1dbe7bae8d23677e9a40e2d515237465aad7438 not found: ID does not exist" containerID="cbad1694c39409f46b82fab5f1dbe7bae8d23677e9a40e2d515237465aad7438"
Feb 24 00:17:28 crc kubenswrapper[4824]: I0224 00:17:28.389487 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbad1694c39409f46b82fab5f1dbe7bae8d23677e9a40e2d515237465aad7438"} err="failed to get container status \"cbad1694c39409f46b82fab5f1dbe7bae8d23677e9a40e2d515237465aad7438\": rpc error: code = NotFound desc = could not find container \"cbad1694c39409f46b82fab5f1dbe7bae8d23677e9a40e2d515237465aad7438\": container with ID starting with cbad1694c39409f46b82fab5f1dbe7bae8d23677e9a40e2d515237465aad7438 not found: ID does not exist"
Feb 24 00:17:28 crc kubenswrapper[4824]: I0224 00:17:28.435641 4824 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a392c527-174d-4f66-a7cd-5f625192f3c7-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 24 00:17:28 crc kubenswrapper[4824]: I0224 00:17:28.435681 4824 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a392c527-174d-4f66-a7cd-5f625192f3c7-utilities\") on node \"crc\" DevicePath \"\""
Feb 24 00:17:28 crc kubenswrapper[4824]: I0224 00:17:28.435691 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hxsv\" (UniqueName: \"kubernetes.io/projected/a392c527-174d-4f66-a7cd-5f625192f3c7-kube-api-access-5hxsv\") on node \"crc\" DevicePath \"\""
Feb 24 00:17:28 crc kubenswrapper[4824]: I0224 00:17:28.671947 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-49mft"]
Feb 24 00:17:28 crc kubenswrapper[4824]: I0224 00:17:28.675880 4824 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-49mft"]
Feb 24 00:17:28 crc kubenswrapper[4824]: I0224 00:17:28.699359 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a392c527-174d-4f66-a7cd-5f625192f3c7" path="/var/lib/kubelet/pods/a392c527-174d-4f66-a7cd-5f625192f3c7/volumes"
Feb 24 00:17:31 crc kubenswrapper[4824]: I0224 00:17:31.704236 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t2v92"]
Feb 24 00:17:31 crc kubenswrapper[4824]: E0224 00:17:31.704514 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a392c527-174d-4f66-a7cd-5f625192f3c7" containerName="registry-server"
Feb 24 00:17:31 crc kubenswrapper[4824]: I0224 00:17:31.704621 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="a392c527-174d-4f66-a7cd-5f625192f3c7" containerName="registry-server"
Feb 24 00:17:31 crc kubenswrapper[4824]: E0224 00:17:31.704636 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a392c527-174d-4f66-a7cd-5f625192f3c7" containerName="extract-content"
Feb 24 00:17:31 crc kubenswrapper[4824]: I0224 00:17:31.704645 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="a392c527-174d-4f66-a7cd-5f625192f3c7" containerName="extract-content"
Feb 24 00:17:31 crc kubenswrapper[4824]: E0224 00:17:31.704658 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a392c527-174d-4f66-a7cd-5f625192f3c7" containerName="extract-utilities"
Feb 24 00:17:31 crc kubenswrapper[4824]: I0224 00:17:31.704667 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="a392c527-174d-4f66-a7cd-5f625192f3c7" containerName="extract-utilities"
Feb 24 00:17:31 crc kubenswrapper[4824]: I0224 00:17:31.704806 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="a392c527-174d-4f66-a7cd-5f625192f3c7" containerName="registry-server"
Feb 24 00:17:31 crc kubenswrapper[4824]: I0224 00:17:31.705729 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t2v92"
Feb 24 00:17:31 crc kubenswrapper[4824]: I0224 00:17:31.707908 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Feb 24 00:17:31 crc kubenswrapper[4824]: I0224 00:17:31.716074 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t2v92"]
Feb 24 00:17:31 crc kubenswrapper[4824]: I0224 00:17:31.875221 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t2v92\" (UID: \"5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t2v92"
Feb 24 00:17:31 crc kubenswrapper[4824]: I0224 00:17:31.875317 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9crzx\" (UniqueName: \"kubernetes.io/projected/5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d-kube-api-access-9crzx\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t2v92\" (UID: \"5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t2v92"
Feb 24 00:17:31 crc kubenswrapper[4824]: I0224 00:17:31.875365 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t2v92\" (UID: \"5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t2v92"
Feb 24 00:17:31 crc kubenswrapper[4824]: I0224 00:17:31.976278 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t2v92\" (UID: \"5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t2v92"
Feb 24 00:17:31 crc kubenswrapper[4824]: I0224 00:17:31.976330 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9crzx\" (UniqueName: \"kubernetes.io/projected/5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d-kube-api-access-9crzx\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t2v92\" (UID: \"5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t2v92"
Feb 24 00:17:31 crc kubenswrapper[4824]: I0224 00:17:31.976363 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t2v92\" (UID: \"5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t2v92"
Feb 24 00:17:31 crc kubenswrapper[4824]: I0224 00:17:31.976979 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t2v92\" (UID: \"5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t2v92"
Feb 24 00:17:31 crc kubenswrapper[4824]: I0224 00:17:31.977015 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t2v92\" (UID: \"5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t2v92"
Feb 24 00:17:32 crc kubenswrapper[4824]: I0224 00:17:32.003353 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9crzx\" (UniqueName: \"kubernetes.io/projected/5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d-kube-api-access-9crzx\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t2v92\" (UID: \"5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t2v92"
Feb 24 00:17:32 crc kubenswrapper[4824]: I0224 00:17:32.029490 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t2v92"
Feb 24 00:17:32 crc kubenswrapper[4824]: I0224 00:17:32.245413 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t2v92"]
Feb 24 00:17:32 crc kubenswrapper[4824]: I0224 00:17:32.365126 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t2v92" event={"ID":"5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d","Type":"ContainerStarted","Data":"35bbfc639b02e1e6a0f72cc00f210f348d0be10c023c511eb8a617921f11e775"}
Feb 24 00:17:33 crc kubenswrapper[4824]: I0224 00:17:33.373160 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t2v92" event={"ID":"5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d","Type":"ContainerStarted","Data":"8b1b7abd9a3483e4bd0c8ee3dc68d6fb0d8f48e612be434df1ee586f3cb60b45"}
Feb 24 00:17:34 crc kubenswrapper[4824]: I0224 00:17:34.379831 4824 generic.go:334] "Generic (PLEG): container finished" podID="5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d" containerID="8b1b7abd9a3483e4bd0c8ee3dc68d6fb0d8f48e612be434df1ee586f3cb60b45" exitCode=0
Feb 24 00:17:34 crc kubenswrapper[4824]: I0224 00:17:34.379899 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t2v92" event={"ID":"5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d","Type":"ContainerDied","Data":"8b1b7abd9a3483e4bd0c8ee3dc68d6fb0d8f48e612be434df1ee586f3cb60b45"}
Feb 24 00:17:34 crc kubenswrapper[4824]: I0224 00:17:34.382731 4824 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 24 00:17:37 crc kubenswrapper[4824]: I0224 00:17:37.090985 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fnmwrf"]
Feb 24 00:17:37 crc kubenswrapper[4824]: I0224 00:17:37.092600 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fnmwrf"
Feb 24 00:17:37 crc kubenswrapper[4824]: I0224 00:17:37.101069 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fnmwrf"]
Feb 24 00:17:37 crc kubenswrapper[4824]: I0224 00:17:37.147498 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzjsj\" (UniqueName: \"kubernetes.io/projected/7191d6cb-0051-4cd2-a93d-a26af6142eb8-kube-api-access-hzjsj\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fnmwrf\" (UID: \"7191d6cb-0051-4cd2-a93d-a26af6142eb8\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fnmwrf"
Feb 24 00:17:37 crc kubenswrapper[4824]: I0224 00:17:37.147614 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7191d6cb-0051-4cd2-a93d-a26af6142eb8-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fnmwrf\" (UID: \"7191d6cb-0051-4cd2-a93d-a26af6142eb8\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fnmwrf"
Feb 24 00:17:37 crc kubenswrapper[4824]: I0224 00:17:37.147753 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7191d6cb-0051-4cd2-a93d-a26af6142eb8-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fnmwrf\" (UID: \"7191d6cb-0051-4cd2-a93d-a26af6142eb8\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fnmwrf"
Feb 24 00:17:37 crc kubenswrapper[4824]: I0224 00:17:37.249655 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzjsj\" (UniqueName: \"kubernetes.io/projected/7191d6cb-0051-4cd2-a93d-a26af6142eb8-kube-api-access-hzjsj\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fnmwrf\" (UID: \"7191d6cb-0051-4cd2-a93d-a26af6142eb8\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fnmwrf"
Feb 24 00:17:37 crc kubenswrapper[4824]: I0224 00:17:37.249771 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7191d6cb-0051-4cd2-a93d-a26af6142eb8-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fnmwrf\" (UID: \"7191d6cb-0051-4cd2-a93d-a26af6142eb8\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fnmwrf"
Feb 24 00:17:37 crc kubenswrapper[4824]: I0224 00:17:37.249809 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7191d6cb-0051-4cd2-a93d-a26af6142eb8-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fnmwrf\" (UID: \"7191d6cb-0051-4cd2-a93d-a26af6142eb8\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fnmwrf"
Feb 24 00:17:37 crc kubenswrapper[4824]: I0224 00:17:37.250331 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7191d6cb-0051-4cd2-a93d-a26af6142eb8-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fnmwrf\" (UID: \"7191d6cb-0051-4cd2-a93d-a26af6142eb8\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fnmwrf"
Feb 24 00:17:37 crc kubenswrapper[4824]: I0224 00:17:37.250674 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7191d6cb-0051-4cd2-a93d-a26af6142eb8-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fnmwrf\" (UID: \"7191d6cb-0051-4cd2-a93d-a26af6142eb8\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fnmwrf"
Feb 24 00:17:37 crc kubenswrapper[4824]: I0224 00:17:37.268416 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzjsj\" (UniqueName: \"kubernetes.io/projected/7191d6cb-0051-4cd2-a93d-a26af6142eb8-kube-api-access-hzjsj\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fnmwrf\" (UID: \"7191d6cb-0051-4cd2-a93d-a26af6142eb8\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fnmwrf"
Feb 24 00:17:37 crc kubenswrapper[4824]: I0224 00:17:37.417807 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fnmwrf"
Feb 24 00:17:37 crc kubenswrapper[4824]: I0224 00:17:37.903538 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f17dtpw"]
Feb 24 00:17:37 crc kubenswrapper[4824]: I0224 00:17:37.904787 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f17dtpw"
Feb 24 00:17:37 crc kubenswrapper[4824]: I0224 00:17:37.922371 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f17dtpw"]
Feb 24 00:17:37 crc kubenswrapper[4824]: I0224 00:17:37.960217 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/55bd419c-9f16-434a-9a7f-0693ab6601d4-bundle\") pod \"00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f17dtpw\" (UID: \"55bd419c-9f16-434a-9a7f-0693ab6601d4\") " pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f17dtpw"
Feb 24 00:17:37 crc kubenswrapper[4824]: I0224 00:17:37.960399 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5h5lz\" (UniqueName: \"kubernetes.io/projected/55bd419c-9f16-434a-9a7f-0693ab6601d4-kube-api-access-5h5lz\") pod \"00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f17dtpw\" (UID: \"55bd419c-9f16-434a-9a7f-0693ab6601d4\") " pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f17dtpw"
Feb 24 00:17:37 crc kubenswrapper[4824]: I0224 00:17:37.960504 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/55bd419c-9f16-434a-9a7f-0693ab6601d4-util\") pod \"00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f17dtpw\" (UID: \"55bd419c-9f16-434a-9a7f-0693ab6601d4\") " pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f17dtpw"
Feb 24 00:17:38 crc kubenswrapper[4824]: I0224 00:17:38.062413 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/55bd419c-9f16-434a-9a7f-0693ab6601d4-util\") pod \"00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f17dtpw\" (UID: \"55bd419c-9f16-434a-9a7f-0693ab6601d4\") " pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f17dtpw"
Feb 24 00:17:38 crc kubenswrapper[4824]: I0224 00:17:38.062589 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/55bd419c-9f16-434a-9a7f-0693ab6601d4-bundle\") pod \"00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f17dtpw\" (UID: \"55bd419c-9f16-434a-9a7f-0693ab6601d4\") " pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f17dtpw"
Feb 24 00:17:38 crc kubenswrapper[4824]: I0224 00:17:38.062632 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5h5lz\" (UniqueName: \"kubernetes.io/projected/55bd419c-9f16-434a-9a7f-0693ab6601d4-kube-api-access-5h5lz\") pod \"00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f17dtpw\" (UID: \"55bd419c-9f16-434a-9a7f-0693ab6601d4\") " pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f17dtpw"
Feb 24 00:17:38 crc kubenswrapper[4824]: I0224 00:17:38.063390 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/55bd419c-9f16-434a-9a7f-0693ab6601d4-util\") pod \"00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f17dtpw\" (UID: \"55bd419c-9f16-434a-9a7f-0693ab6601d4\") " pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f17dtpw"
Feb 24 00:17:38 crc kubenswrapper[4824]: I0224 00:17:38.063405 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/55bd419c-9f16-434a-9a7f-0693ab6601d4-bundle\") pod \"00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f17dtpw\" (UID: \"55bd419c-9f16-434a-9a7f-0693ab6601d4\") " pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f17dtpw"
Feb 24 00:17:38 crc kubenswrapper[4824]: I0224 00:17:38.084682 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5h5lz\" (UniqueName: \"kubernetes.io/projected/55bd419c-9f16-434a-9a7f-0693ab6601d4-kube-api-access-5h5lz\") pod \"00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f17dtpw\" (UID: \"55bd419c-9f16-434a-9a7f-0693ab6601d4\") " pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f17dtpw"
Feb 24 00:17:38 crc kubenswrapper[4824]: I0224 00:17:38.221417 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f17dtpw"
Feb 24 00:17:38 crc kubenswrapper[4824]: I0224 00:17:38.425876 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f17dtpw"]
Feb 24 00:17:38 crc kubenswrapper[4824]: I0224 00:17:38.465946 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fnmwrf"]
Feb 24 00:17:38 crc kubenswrapper[4824]: W0224 00:17:38.473800 4824 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7191d6cb_0051_4cd2_a93d_a26af6142eb8.slice/crio-36022042664f199397e29d69a9aa82c3b910fb25a3770a289d7d8170493cb3cf WatchSource:0}: Error finding container 36022042664f199397e29d69a9aa82c3b910fb25a3770a289d7d8170493cb3cf: Status 404 returned error can't find the container with id 36022042664f199397e29d69a9aa82c3b910fb25a3770a289d7d8170493cb3cf
Feb 24 00:17:39 crc kubenswrapper[4824]: I0224 00:17:39.411890 4824 generic.go:334] "Generic (PLEG): container finished" podID="55bd419c-9f16-434a-9a7f-0693ab6601d4" containerID="af924869ded52d4061cfbf48fbbd43b0fbd16756f85c03cfaad1ab7c3977040d" exitCode=0
Feb 24 00:17:39 crc kubenswrapper[4824]: I0224 00:17:39.412044 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f17dtpw" event={"ID":"55bd419c-9f16-434a-9a7f-0693ab6601d4","Type":"ContainerDied","Data":"af924869ded52d4061cfbf48fbbd43b0fbd16756f85c03cfaad1ab7c3977040d"}
Feb 24 00:17:39 crc kubenswrapper[4824]: I0224 00:17:39.412083 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f17dtpw" event={"ID":"55bd419c-9f16-434a-9a7f-0693ab6601d4","Type":"ContainerStarted","Data":"3a70ebd7757696ac2b83b90dea2f46e099f4cc03587c64932de0b8718290ef9a"}
Feb 24 00:17:39 crc kubenswrapper[4824]: I0224 00:17:39.414272 4824 generic.go:334] "Generic (PLEG): container finished" podID="5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d" containerID="496810b67b0efb6898ef84404f173e4e18bf3b537c6c09681e91f7b78d8dfe8a" exitCode=0
Feb 24 00:17:39 crc kubenswrapper[4824]: I0224 00:17:39.414579 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t2v92" event={"ID":"5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d","Type":"ContainerDied","Data":"496810b67b0efb6898ef84404f173e4e18bf3b537c6c09681e91f7b78d8dfe8a"}
Feb 24 00:17:39 crc kubenswrapper[4824]: I0224 00:17:39.417158 4824 generic.go:334] "Generic (PLEG): container finished" podID="7191d6cb-0051-4cd2-a93d-a26af6142eb8" containerID="b8b6c3ce166e52f1d27c7cb3a651a6489b7c1dc51c75081ac1ab7350971c8f9b" exitCode=0
Feb 24 00:17:39 crc kubenswrapper[4824]: I0224 00:17:39.417207 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fnmwrf" event={"ID":"7191d6cb-0051-4cd2-a93d-a26af6142eb8","Type":"ContainerDied","Data":"b8b6c3ce166e52f1d27c7cb3a651a6489b7c1dc51c75081ac1ab7350971c8f9b"}
Feb 24 00:17:39 crc kubenswrapper[4824]: I0224 00:17:39.417238 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fnmwrf" event={"ID":"7191d6cb-0051-4cd2-a93d-a26af6142eb8","Type":"ContainerStarted","Data":"36022042664f199397e29d69a9aa82c3b910fb25a3770a289d7d8170493cb3cf"}
Feb 24 00:17:40 crc kubenswrapper[4824]: I0224 00:17:40.425288 4824 generic.go:334] "Generic (PLEG): container finished" podID="5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d" containerID="25b675d6ab9297c81a0362df0ec62867909864db3365cf4211812f0260e4c4fe" exitCode=0
Feb 24 00:17:40 crc kubenswrapper[4824]: I0224 00:17:40.425513 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t2v92" event={"ID":"5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d","Type":"ContainerDied","Data":"25b675d6ab9297c81a0362df0ec62867909864db3365cf4211812f0260e4c4fe"}
Feb 24 00:17:41 crc kubenswrapper[4824]: I0224 00:17:41.669556 4824 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t2v92" Feb 24 00:17:41 crc kubenswrapper[4824]: I0224 00:17:41.822936 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d-bundle\") pod \"5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d\" (UID: \"5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d\") " Feb 24 00:17:41 crc kubenswrapper[4824]: I0224 00:17:41.823071 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d-util\") pod \"5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d\" (UID: \"5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d\") " Feb 24 00:17:41 crc kubenswrapper[4824]: I0224 00:17:41.823303 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9crzx\" (UniqueName: \"kubernetes.io/projected/5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d-kube-api-access-9crzx\") pod \"5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d\" (UID: \"5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d\") " Feb 24 00:17:41 crc kubenswrapper[4824]: I0224 00:17:41.825169 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d-bundle" (OuterVolumeSpecName: "bundle") pod "5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d" (UID: "5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:17:41 crc kubenswrapper[4824]: I0224 00:17:41.832145 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d-kube-api-access-9crzx" (OuterVolumeSpecName: "kube-api-access-9crzx") pod "5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d" (UID: "5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d"). InnerVolumeSpecName "kube-api-access-9crzx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:17:41 crc kubenswrapper[4824]: I0224 00:17:41.852100 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d-util" (OuterVolumeSpecName: "util") pod "5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d" (UID: "5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:17:41 crc kubenswrapper[4824]: I0224 00:17:41.925366 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9crzx\" (UniqueName: \"kubernetes.io/projected/5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d-kube-api-access-9crzx\") on node \"crc\" DevicePath \"\"" Feb 24 00:17:41 crc kubenswrapper[4824]: I0224 00:17:41.925390 4824 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 00:17:41 crc kubenswrapper[4824]: I0224 00:17:41.925398 4824 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d-util\") on node \"crc\" DevicePath \"\"" Feb 24 00:17:42 crc kubenswrapper[4824]: I0224 00:17:42.442710 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t2v92" event={"ID":"5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d","Type":"ContainerDied","Data":"35bbfc639b02e1e6a0f72cc00f210f348d0be10c023c511eb8a617921f11e775"} Feb 24 00:17:42 crc kubenswrapper[4824]: I0224 00:17:42.442770 4824 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35bbfc639b02e1e6a0f72cc00f210f348d0be10c023c511eb8a617921f11e775" Feb 24 00:17:42 crc kubenswrapper[4824]: I0224 00:17:42.442911 4824 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t2v92" Feb 24 00:17:44 crc kubenswrapper[4824]: I0224 00:17:44.104666 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jcmlq"] Feb 24 00:17:44 crc kubenswrapper[4824]: E0224 00:17:44.106199 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d" containerName="util" Feb 24 00:17:44 crc kubenswrapper[4824]: I0224 00:17:44.106291 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d" containerName="util" Feb 24 00:17:44 crc kubenswrapper[4824]: E0224 00:17:44.106370 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d" containerName="pull" Feb 24 00:17:44 crc kubenswrapper[4824]: I0224 00:17:44.106541 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d" containerName="pull" Feb 24 00:17:44 crc kubenswrapper[4824]: E0224 00:17:44.106601 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d" containerName="extract" Feb 24 00:17:44 crc kubenswrapper[4824]: I0224 00:17:44.106677 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d" containerName="extract" Feb 24 00:17:44 crc kubenswrapper[4824]: I0224 00:17:44.106849 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d" containerName="extract" Feb 24 00:17:44 crc kubenswrapper[4824]: I0224 00:17:44.107985 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jcmlq" Feb 24 00:17:44 crc kubenswrapper[4824]: I0224 00:17:44.118683 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jcmlq"] Feb 24 00:17:44 crc kubenswrapper[4824]: I0224 00:17:44.155449 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbnd2\" (UniqueName: \"kubernetes.io/projected/379ee973-5632-434f-953c-7f23d7dc8f9d-kube-api-access-kbnd2\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jcmlq\" (UID: \"379ee973-5632-434f-953c-7f23d7dc8f9d\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jcmlq" Feb 24 00:17:44 crc kubenswrapper[4824]: I0224 00:17:44.155545 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/379ee973-5632-434f-953c-7f23d7dc8f9d-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jcmlq\" (UID: \"379ee973-5632-434f-953c-7f23d7dc8f9d\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jcmlq" Feb 24 00:17:44 crc kubenswrapper[4824]: I0224 00:17:44.155587 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/379ee973-5632-434f-953c-7f23d7dc8f9d-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jcmlq\" (UID: \"379ee973-5632-434f-953c-7f23d7dc8f9d\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jcmlq" Feb 24 00:17:44 crc kubenswrapper[4824]: I0224 00:17:44.256825 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/379ee973-5632-434f-953c-7f23d7dc8f9d-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jcmlq\" (UID: \"379ee973-5632-434f-953c-7f23d7dc8f9d\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jcmlq" Feb 24 00:17:44 crc kubenswrapper[4824]: I0224 00:17:44.256908 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/379ee973-5632-434f-953c-7f23d7dc8f9d-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jcmlq\" (UID: \"379ee973-5632-434f-953c-7f23d7dc8f9d\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jcmlq" Feb 24 00:17:44 crc kubenswrapper[4824]: I0224 00:17:44.257092 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbnd2\" (UniqueName: \"kubernetes.io/projected/379ee973-5632-434f-953c-7f23d7dc8f9d-kube-api-access-kbnd2\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jcmlq\" (UID: \"379ee973-5632-434f-953c-7f23d7dc8f9d\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jcmlq" Feb 24 00:17:44 crc kubenswrapper[4824]: I0224 00:17:44.257437 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/379ee973-5632-434f-953c-7f23d7dc8f9d-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jcmlq\" (UID: \"379ee973-5632-434f-953c-7f23d7dc8f9d\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jcmlq" Feb 24 00:17:44 crc kubenswrapper[4824]: I0224 00:17:44.257604 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/379ee973-5632-434f-953c-7f23d7dc8f9d-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jcmlq\" (UID: 
\"379ee973-5632-434f-953c-7f23d7dc8f9d\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jcmlq" Feb 24 00:17:44 crc kubenswrapper[4824]: I0224 00:17:44.275406 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbnd2\" (UniqueName: \"kubernetes.io/projected/379ee973-5632-434f-953c-7f23d7dc8f9d-kube-api-access-kbnd2\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jcmlq\" (UID: \"379ee973-5632-434f-953c-7f23d7dc8f9d\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jcmlq" Feb 24 00:17:44 crc kubenswrapper[4824]: I0224 00:17:44.436436 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jcmlq" Feb 24 00:17:44 crc kubenswrapper[4824]: I0224 00:17:44.714914 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jcmlq"] Feb 24 00:17:45 crc kubenswrapper[4824]: I0224 00:17:45.465932 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jcmlq" event={"ID":"379ee973-5632-434f-953c-7f23d7dc8f9d","Type":"ContainerStarted","Data":"2df8ec581a42c6978d2dab631a2800272eb3bf9842b24962499fc7f88113c8e7"} Feb 24 00:17:46 crc kubenswrapper[4824]: I0224 00:17:46.476409 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f17dtpw" event={"ID":"55bd419c-9f16-434a-9a7f-0693ab6601d4","Type":"ContainerStarted","Data":"3bef49debf6ae8eaef2587932dd58a555a623eb36885e57bf4fd194158de2166"} Feb 24 00:17:47 crc kubenswrapper[4824]: I0224 00:17:47.487731 4824 generic.go:334] "Generic (PLEG): container finished" podID="55bd419c-9f16-434a-9a7f-0693ab6601d4" 
containerID="3bef49debf6ae8eaef2587932dd58a555a623eb36885e57bf4fd194158de2166" exitCode=0 Feb 24 00:17:47 crc kubenswrapper[4824]: I0224 00:17:47.487794 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f17dtpw" event={"ID":"55bd419c-9f16-434a-9a7f-0693ab6601d4","Type":"ContainerDied","Data":"3bef49debf6ae8eaef2587932dd58a555a623eb36885e57bf4fd194158de2166"} Feb 24 00:17:47 crc kubenswrapper[4824]: I0224 00:17:47.490499 4824 generic.go:334] "Generic (PLEG): container finished" podID="379ee973-5632-434f-953c-7f23d7dc8f9d" containerID="2b933892d11c381c861cca7ebacf1915b4717e2d12d2e77f789b48f0032df3b0" exitCode=0 Feb 24 00:17:47 crc kubenswrapper[4824]: I0224 00:17:47.490563 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jcmlq" event={"ID":"379ee973-5632-434f-953c-7f23d7dc8f9d","Type":"ContainerDied","Data":"2b933892d11c381c861cca7ebacf1915b4717e2d12d2e77f789b48f0032df3b0"} Feb 24 00:17:49 crc kubenswrapper[4824]: I0224 00:17:49.508396 4824 generic.go:334] "Generic (PLEG): container finished" podID="7191d6cb-0051-4cd2-a93d-a26af6142eb8" containerID="4f3e7b9f784cba5e2782ce93e66affbdeefd7741a1463a2e557e7ed9b454f55d" exitCode=0 Feb 24 00:17:49 crc kubenswrapper[4824]: I0224 00:17:49.508582 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fnmwrf" event={"ID":"7191d6cb-0051-4cd2-a93d-a26af6142eb8","Type":"ContainerDied","Data":"4f3e7b9f784cba5e2782ce93e66affbdeefd7741a1463a2e557e7ed9b454f55d"} Feb 24 00:17:49 crc kubenswrapper[4824]: I0224 00:17:49.522162 4824 generic.go:334] "Generic (PLEG): container finished" podID="55bd419c-9f16-434a-9a7f-0693ab6601d4" containerID="4b6b8dfb291b55977edba3b2c382b669bcec9c4777ed38a9500d565db6603bc4" exitCode=0 Feb 24 00:17:49 crc kubenswrapper[4824]: I0224 00:17:49.522224 
4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f17dtpw" event={"ID":"55bd419c-9f16-434a-9a7f-0693ab6601d4","Type":"ContainerDied","Data":"4b6b8dfb291b55977edba3b2c382b669bcec9c4777ed38a9500d565db6603bc4"} Feb 24 00:17:50 crc kubenswrapper[4824]: I0224 00:17:50.542437 4824 generic.go:334] "Generic (PLEG): container finished" podID="7191d6cb-0051-4cd2-a93d-a26af6142eb8" containerID="3e6e918d489942ec1d3edee5d48dd806e20a4c9ad46747b937d9623720eddc58" exitCode=0 Feb 24 00:17:50 crc kubenswrapper[4824]: I0224 00:17:50.542557 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fnmwrf" event={"ID":"7191d6cb-0051-4cd2-a93d-a26af6142eb8","Type":"ContainerDied","Data":"3e6e918d489942ec1d3edee5d48dd806e20a4c9ad46747b937d9623720eddc58"} Feb 24 00:17:53 crc kubenswrapper[4824]: I0224 00:17:53.458214 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f17dtpw" Feb 24 00:17:53 crc kubenswrapper[4824]: I0224 00:17:53.459106 4824 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fnmwrf" Feb 24 00:17:53 crc kubenswrapper[4824]: I0224 00:17:53.582892 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7191d6cb-0051-4cd2-a93d-a26af6142eb8-util\") pod \"7191d6cb-0051-4cd2-a93d-a26af6142eb8\" (UID: \"7191d6cb-0051-4cd2-a93d-a26af6142eb8\") " Feb 24 00:17:53 crc kubenswrapper[4824]: I0224 00:17:53.583220 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5h5lz\" (UniqueName: \"kubernetes.io/projected/55bd419c-9f16-434a-9a7f-0693ab6601d4-kube-api-access-5h5lz\") pod \"55bd419c-9f16-434a-9a7f-0693ab6601d4\" (UID: \"55bd419c-9f16-434a-9a7f-0693ab6601d4\") " Feb 24 00:17:53 crc kubenswrapper[4824]: I0224 00:17:53.583246 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzjsj\" (UniqueName: \"kubernetes.io/projected/7191d6cb-0051-4cd2-a93d-a26af6142eb8-kube-api-access-hzjsj\") pod \"7191d6cb-0051-4cd2-a93d-a26af6142eb8\" (UID: \"7191d6cb-0051-4cd2-a93d-a26af6142eb8\") " Feb 24 00:17:53 crc kubenswrapper[4824]: I0224 00:17:53.583268 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/55bd419c-9f16-434a-9a7f-0693ab6601d4-bundle\") pod \"55bd419c-9f16-434a-9a7f-0693ab6601d4\" (UID: \"55bd419c-9f16-434a-9a7f-0693ab6601d4\") " Feb 24 00:17:53 crc kubenswrapper[4824]: I0224 00:17:53.583342 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7191d6cb-0051-4cd2-a93d-a26af6142eb8-bundle\") pod \"7191d6cb-0051-4cd2-a93d-a26af6142eb8\" (UID: \"7191d6cb-0051-4cd2-a93d-a26af6142eb8\") " Feb 24 00:17:53 crc kubenswrapper[4824]: I0224 00:17:53.583389 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/55bd419c-9f16-434a-9a7f-0693ab6601d4-util\") pod \"55bd419c-9f16-434a-9a7f-0693ab6601d4\" (UID: \"55bd419c-9f16-434a-9a7f-0693ab6601d4\") " Feb 24 00:17:53 crc kubenswrapper[4824]: I0224 00:17:53.599687 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7191d6cb-0051-4cd2-a93d-a26af6142eb8-bundle" (OuterVolumeSpecName: "bundle") pod "7191d6cb-0051-4cd2-a93d-a26af6142eb8" (UID: "7191d6cb-0051-4cd2-a93d-a26af6142eb8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:17:53 crc kubenswrapper[4824]: I0224 00:17:53.601649 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55bd419c-9f16-434a-9a7f-0693ab6601d4-bundle" (OuterVolumeSpecName: "bundle") pod "55bd419c-9f16-434a-9a7f-0693ab6601d4" (UID: "55bd419c-9f16-434a-9a7f-0693ab6601d4"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:17:53 crc kubenswrapper[4824]: I0224 00:17:53.612446 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fnmwrf" event={"ID":"7191d6cb-0051-4cd2-a93d-a26af6142eb8","Type":"ContainerDied","Data":"36022042664f199397e29d69a9aa82c3b910fb25a3770a289d7d8170493cb3cf"} Feb 24 00:17:53 crc kubenswrapper[4824]: I0224 00:17:53.612492 4824 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36022042664f199397e29d69a9aa82c3b910fb25a3770a289d7d8170493cb3cf" Feb 24 00:17:53 crc kubenswrapper[4824]: I0224 00:17:53.612576 4824 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fnmwrf" Feb 24 00:17:53 crc kubenswrapper[4824]: I0224 00:17:53.615344 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f17dtpw" event={"ID":"55bd419c-9f16-434a-9a7f-0693ab6601d4","Type":"ContainerDied","Data":"3a70ebd7757696ac2b83b90dea2f46e099f4cc03587c64932de0b8718290ef9a"} Feb 24 00:17:53 crc kubenswrapper[4824]: I0224 00:17:53.615367 4824 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a70ebd7757696ac2b83b90dea2f46e099f4cc03587c64932de0b8718290ef9a" Feb 24 00:17:53 crc kubenswrapper[4824]: I0224 00:17:53.615407 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f17dtpw" Feb 24 00:17:53 crc kubenswrapper[4824]: I0224 00:17:53.616552 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55bd419c-9f16-434a-9a7f-0693ab6601d4-util" (OuterVolumeSpecName: "util") pod "55bd419c-9f16-434a-9a7f-0693ab6601d4" (UID: "55bd419c-9f16-434a-9a7f-0693ab6601d4"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:17:53 crc kubenswrapper[4824]: I0224 00:17:53.620762 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7191d6cb-0051-4cd2-a93d-a26af6142eb8-kube-api-access-hzjsj" (OuterVolumeSpecName: "kube-api-access-hzjsj") pod "7191d6cb-0051-4cd2-a93d-a26af6142eb8" (UID: "7191d6cb-0051-4cd2-a93d-a26af6142eb8"). InnerVolumeSpecName "kube-api-access-hzjsj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:17:53 crc kubenswrapper[4824]: I0224 00:17:53.633896 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7191d6cb-0051-4cd2-a93d-a26af6142eb8-util" (OuterVolumeSpecName: "util") pod "7191d6cb-0051-4cd2-a93d-a26af6142eb8" (UID: "7191d6cb-0051-4cd2-a93d-a26af6142eb8"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:17:53 crc kubenswrapper[4824]: I0224 00:17:53.639422 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55bd419c-9f16-434a-9a7f-0693ab6601d4-kube-api-access-5h5lz" (OuterVolumeSpecName: "kube-api-access-5h5lz") pod "55bd419c-9f16-434a-9a7f-0693ab6601d4" (UID: "55bd419c-9f16-434a-9a7f-0693ab6601d4"). InnerVolumeSpecName "kube-api-access-5h5lz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:17:53 crc kubenswrapper[4824]: I0224 00:17:53.685032 4824 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7191d6cb-0051-4cd2-a93d-a26af6142eb8-util\") on node \"crc\" DevicePath \"\"" Feb 24 00:17:53 crc kubenswrapper[4824]: I0224 00:17:53.685073 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5h5lz\" (UniqueName: \"kubernetes.io/projected/55bd419c-9f16-434a-9a7f-0693ab6601d4-kube-api-access-5h5lz\") on node \"crc\" DevicePath \"\"" Feb 24 00:17:53 crc kubenswrapper[4824]: I0224 00:17:53.685086 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzjsj\" (UniqueName: \"kubernetes.io/projected/7191d6cb-0051-4cd2-a93d-a26af6142eb8-kube-api-access-hzjsj\") on node \"crc\" DevicePath \"\"" Feb 24 00:17:53 crc kubenswrapper[4824]: I0224 00:17:53.685095 4824 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/55bd419c-9f16-434a-9a7f-0693ab6601d4-bundle\") on node \"crc\" DevicePath \"\"" Feb 
24 00:17:53 crc kubenswrapper[4824]: I0224 00:17:53.685102 4824 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7191d6cb-0051-4cd2-a93d-a26af6142eb8-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 00:17:53 crc kubenswrapper[4824]: I0224 00:17:53.685111 4824 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/55bd419c-9f16-434a-9a7f-0693ab6601d4-util\") on node \"crc\" DevicePath \"\"" Feb 24 00:17:54 crc kubenswrapper[4824]: I0224 00:17:54.576438 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-df47j"] Feb 24 00:17:54 crc kubenswrapper[4824]: E0224 00:17:54.576726 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7191d6cb-0051-4cd2-a93d-a26af6142eb8" containerName="util" Feb 24 00:17:54 crc kubenswrapper[4824]: I0224 00:17:54.576744 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="7191d6cb-0051-4cd2-a93d-a26af6142eb8" containerName="util" Feb 24 00:17:54 crc kubenswrapper[4824]: E0224 00:17:54.576762 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55bd419c-9f16-434a-9a7f-0693ab6601d4" containerName="extract" Feb 24 00:17:54 crc kubenswrapper[4824]: I0224 00:17:54.576769 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="55bd419c-9f16-434a-9a7f-0693ab6601d4" containerName="extract" Feb 24 00:17:54 crc kubenswrapper[4824]: E0224 00:17:54.576785 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55bd419c-9f16-434a-9a7f-0693ab6601d4" containerName="pull" Feb 24 00:17:54 crc kubenswrapper[4824]: I0224 00:17:54.576794 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="55bd419c-9f16-434a-9a7f-0693ab6601d4" containerName="pull" Feb 24 00:17:54 crc kubenswrapper[4824]: E0224 00:17:54.576808 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7191d6cb-0051-4cd2-a93d-a26af6142eb8" 
containerName="extract" Feb 24 00:17:54 crc kubenswrapper[4824]: I0224 00:17:54.576815 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="7191d6cb-0051-4cd2-a93d-a26af6142eb8" containerName="extract" Feb 24 00:17:54 crc kubenswrapper[4824]: E0224 00:17:54.576829 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55bd419c-9f16-434a-9a7f-0693ab6601d4" containerName="util" Feb 24 00:17:54 crc kubenswrapper[4824]: I0224 00:17:54.576836 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="55bd419c-9f16-434a-9a7f-0693ab6601d4" containerName="util" Feb 24 00:17:54 crc kubenswrapper[4824]: E0224 00:17:54.576843 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7191d6cb-0051-4cd2-a93d-a26af6142eb8" containerName="pull" Feb 24 00:17:54 crc kubenswrapper[4824]: I0224 00:17:54.576850 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="7191d6cb-0051-4cd2-a93d-a26af6142eb8" containerName="pull" Feb 24 00:17:54 crc kubenswrapper[4824]: I0224 00:17:54.576956 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="55bd419c-9f16-434a-9a7f-0693ab6601d4" containerName="extract" Feb 24 00:17:54 crc kubenswrapper[4824]: I0224 00:17:54.576974 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="7191d6cb-0051-4cd2-a93d-a26af6142eb8" containerName="extract" Feb 24 00:17:54 crc kubenswrapper[4824]: I0224 00:17:54.577488 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-df47j" Feb 24 00:17:54 crc kubenswrapper[4824]: I0224 00:17:54.580440 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-7b6m7" Feb 24 00:17:54 crc kubenswrapper[4824]: I0224 00:17:54.581153 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Feb 24 00:17:54 crc kubenswrapper[4824]: I0224 00:17:54.581549 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Feb 24 00:17:54 crc kubenswrapper[4824]: I0224 00:17:54.592612 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-df47j"] Feb 24 00:17:54 crc kubenswrapper[4824]: I0224 00:17:54.595808 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8xrt\" (UniqueName: \"kubernetes.io/projected/02a08fee-e933-4730-8755-7419c78d6525-kube-api-access-k8xrt\") pod \"obo-prometheus-operator-68bc856cb9-df47j\" (UID: \"02a08fee-e933-4730-8755-7419c78d6525\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-df47j" Feb 24 00:17:54 crc kubenswrapper[4824]: I0224 00:17:54.632391 4824 generic.go:334] "Generic (PLEG): container finished" podID="379ee973-5632-434f-953c-7f23d7dc8f9d" containerID="fbf3348138844ceae4d727ff46350f42697144a3c6384b33631af75832a5090a" exitCode=0 Feb 24 00:17:54 crc kubenswrapper[4824]: I0224 00:17:54.632447 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jcmlq" event={"ID":"379ee973-5632-434f-953c-7f23d7dc8f9d","Type":"ContainerDied","Data":"fbf3348138844ceae4d727ff46350f42697144a3c6384b33631af75832a5090a"} Feb 24 00:17:54 crc kubenswrapper[4824]: I0224 00:17:54.696435 4824 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-k8xrt\" (UniqueName: \"kubernetes.io/projected/02a08fee-e933-4730-8755-7419c78d6525-kube-api-access-k8xrt\") pod \"obo-prometheus-operator-68bc856cb9-df47j\" (UID: \"02a08fee-e933-4730-8755-7419c78d6525\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-df47j" Feb 24 00:17:54 crc kubenswrapper[4824]: I0224 00:17:54.718452 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8xrt\" (UniqueName: \"kubernetes.io/projected/02a08fee-e933-4730-8755-7419c78d6525-kube-api-access-k8xrt\") pod \"obo-prometheus-operator-68bc856cb9-df47j\" (UID: \"02a08fee-e933-4730-8755-7419c78d6525\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-df47j" Feb 24 00:17:54 crc kubenswrapper[4824]: I0224 00:17:54.738472 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-579b5ff79b-nwsdf"] Feb 24 00:17:54 crc kubenswrapper[4824]: I0224 00:17:54.739366 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-579b5ff79b-nwsdf" Feb 24 00:17:54 crc kubenswrapper[4824]: I0224 00:17:54.742131 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-rbpr6" Feb 24 00:17:54 crc kubenswrapper[4824]: I0224 00:17:54.743094 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Feb 24 00:17:54 crc kubenswrapper[4824]: I0224 00:17:54.753613 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-579b5ff79b-2qg2s"] Feb 24 00:17:54 crc kubenswrapper[4824]: I0224 00:17:54.755017 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-579b5ff79b-2qg2s" Feb 24 00:17:54 crc kubenswrapper[4824]: I0224 00:17:54.764619 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-579b5ff79b-nwsdf"] Feb 24 00:17:54 crc kubenswrapper[4824]: I0224 00:17:54.799505 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7c2534a7-fe82-47e6-b6f9-2bf928bf7c9a-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-579b5ff79b-2qg2s\" (UID: \"7c2534a7-fe82-47e6-b6f9-2bf928bf7c9a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-579b5ff79b-2qg2s" Feb 24 00:17:54 crc kubenswrapper[4824]: I0224 00:17:54.799648 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7c2534a7-fe82-47e6-b6f9-2bf928bf7c9a-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-579b5ff79b-2qg2s\" (UID: \"7c2534a7-fe82-47e6-b6f9-2bf928bf7c9a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-579b5ff79b-2qg2s" Feb 24 00:17:54 crc kubenswrapper[4824]: I0224 00:17:54.799694 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/350461e1-7bfd-4095-9d74-4c3df3159694-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-579b5ff79b-nwsdf\" (UID: \"350461e1-7bfd-4095-9d74-4c3df3159694\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-579b5ff79b-nwsdf" Feb 24 00:17:54 crc kubenswrapper[4824]: I0224 00:17:54.799743 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/350461e1-7bfd-4095-9d74-4c3df3159694-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-579b5ff79b-nwsdf\" (UID: \"350461e1-7bfd-4095-9d74-4c3df3159694\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-579b5ff79b-nwsdf" Feb 24 00:17:54 crc kubenswrapper[4824]: I0224 00:17:54.800849 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-579b5ff79b-2qg2s"] Feb 24 00:17:54 crc kubenswrapper[4824]: I0224 00:17:54.892944 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-df47j" Feb 24 00:17:54 crc kubenswrapper[4824]: I0224 00:17:54.901690 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7c2534a7-fe82-47e6-b6f9-2bf928bf7c9a-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-579b5ff79b-2qg2s\" (UID: \"7c2534a7-fe82-47e6-b6f9-2bf928bf7c9a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-579b5ff79b-2qg2s" Feb 24 00:17:54 crc kubenswrapper[4824]: I0224 00:17:54.901770 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/350461e1-7bfd-4095-9d74-4c3df3159694-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-579b5ff79b-nwsdf\" (UID: \"350461e1-7bfd-4095-9d74-4c3df3159694\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-579b5ff79b-nwsdf" Feb 24 00:17:54 crc kubenswrapper[4824]: I0224 00:17:54.901822 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/350461e1-7bfd-4095-9d74-4c3df3159694-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-579b5ff79b-nwsdf\" (UID: \"350461e1-7bfd-4095-9d74-4c3df3159694\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-579b5ff79b-nwsdf" Feb 24 00:17:54 crc kubenswrapper[4824]: I0224 00:17:54.901869 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7c2534a7-fe82-47e6-b6f9-2bf928bf7c9a-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-579b5ff79b-2qg2s\" (UID: \"7c2534a7-fe82-47e6-b6f9-2bf928bf7c9a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-579b5ff79b-2qg2s" Feb 24 00:17:54 crc kubenswrapper[4824]: I0224 00:17:54.907080 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/350461e1-7bfd-4095-9d74-4c3df3159694-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-579b5ff79b-nwsdf\" (UID: \"350461e1-7bfd-4095-9d74-4c3df3159694\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-579b5ff79b-nwsdf" Feb 24 00:17:54 crc kubenswrapper[4824]: I0224 00:17:54.909765 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7c2534a7-fe82-47e6-b6f9-2bf928bf7c9a-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-579b5ff79b-2qg2s\" (UID: \"7c2534a7-fe82-47e6-b6f9-2bf928bf7c9a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-579b5ff79b-2qg2s" Feb 24 00:17:54 crc kubenswrapper[4824]: I0224 00:17:54.911935 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/350461e1-7bfd-4095-9d74-4c3df3159694-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-579b5ff79b-nwsdf\" (UID: \"350461e1-7bfd-4095-9d74-4c3df3159694\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-579b5ff79b-nwsdf" Feb 24 00:17:54 crc kubenswrapper[4824]: I0224 00:17:54.915166 4824 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7c2534a7-fe82-47e6-b6f9-2bf928bf7c9a-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-579b5ff79b-2qg2s\" (UID: \"7c2534a7-fe82-47e6-b6f9-2bf928bf7c9a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-579b5ff79b-2qg2s" Feb 24 00:17:54 crc kubenswrapper[4824]: I0224 00:17:54.959585 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-hhf7q"] Feb 24 00:17:54 crc kubenswrapper[4824]: I0224 00:17:54.961304 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-hhf7q" Feb 24 00:17:54 crc kubenswrapper[4824]: I0224 00:17:54.971910 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-2n6q8" Feb 24 00:17:54 crc kubenswrapper[4824]: I0224 00:17:54.972129 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Feb 24 00:17:54 crc kubenswrapper[4824]: I0224 00:17:54.977104 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-hhf7q"] Feb 24 00:17:55 crc kubenswrapper[4824]: I0224 00:17:55.005119 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2mj5\" (UniqueName: \"kubernetes.io/projected/823099c2-9764-455a-a682-57c154c0d895-kube-api-access-c2mj5\") pod \"observability-operator-59bdc8b94-hhf7q\" (UID: \"823099c2-9764-455a-a682-57c154c0d895\") " pod="openshift-operators/observability-operator-59bdc8b94-hhf7q" Feb 24 00:17:55 crc kubenswrapper[4824]: I0224 00:17:55.005225 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/823099c2-9764-455a-a682-57c154c0d895-observability-operator-tls\") pod \"observability-operator-59bdc8b94-hhf7q\" (UID: \"823099c2-9764-455a-a682-57c154c0d895\") " pod="openshift-operators/observability-operator-59bdc8b94-hhf7q" Feb 24 00:17:55 crc kubenswrapper[4824]: I0224 00:17:55.061806 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-579b5ff79b-nwsdf" Feb 24 00:17:55 crc kubenswrapper[4824]: I0224 00:17:55.091343 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-579b5ff79b-2qg2s" Feb 24 00:17:55 crc kubenswrapper[4824]: I0224 00:17:55.095854 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-frbxc"] Feb 24 00:17:55 crc kubenswrapper[4824]: I0224 00:17:55.096593 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-frbxc" Feb 24 00:17:55 crc kubenswrapper[4824]: I0224 00:17:55.111643 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-n86jq" Feb 24 00:17:55 crc kubenswrapper[4824]: I0224 00:17:55.112270 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2mj5\" (UniqueName: \"kubernetes.io/projected/823099c2-9764-455a-a682-57c154c0d895-kube-api-access-c2mj5\") pod \"observability-operator-59bdc8b94-hhf7q\" (UID: \"823099c2-9764-455a-a682-57c154c0d895\") " pod="openshift-operators/observability-operator-59bdc8b94-hhf7q" Feb 24 00:17:55 crc kubenswrapper[4824]: I0224 00:17:55.112325 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bdh4\" (UniqueName: \"kubernetes.io/projected/885263fe-5a06-4089-b662-d3e4dbc7d08e-kube-api-access-8bdh4\") pod 
\"perses-operator-5bf474d74f-frbxc\" (UID: \"885263fe-5a06-4089-b662-d3e4dbc7d08e\") " pod="openshift-operators/perses-operator-5bf474d74f-frbxc" Feb 24 00:17:55 crc kubenswrapper[4824]: I0224 00:17:55.112361 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/885263fe-5a06-4089-b662-d3e4dbc7d08e-openshift-service-ca\") pod \"perses-operator-5bf474d74f-frbxc\" (UID: \"885263fe-5a06-4089-b662-d3e4dbc7d08e\") " pod="openshift-operators/perses-operator-5bf474d74f-frbxc" Feb 24 00:17:55 crc kubenswrapper[4824]: I0224 00:17:55.112391 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/823099c2-9764-455a-a682-57c154c0d895-observability-operator-tls\") pod \"observability-operator-59bdc8b94-hhf7q\" (UID: \"823099c2-9764-455a-a682-57c154c0d895\") " pod="openshift-operators/observability-operator-59bdc8b94-hhf7q" Feb 24 00:17:55 crc kubenswrapper[4824]: I0224 00:17:55.116963 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/823099c2-9764-455a-a682-57c154c0d895-observability-operator-tls\") pod \"observability-operator-59bdc8b94-hhf7q\" (UID: \"823099c2-9764-455a-a682-57c154c0d895\") " pod="openshift-operators/observability-operator-59bdc8b94-hhf7q" Feb 24 00:17:55 crc kubenswrapper[4824]: I0224 00:17:55.131210 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-frbxc"] Feb 24 00:17:55 crc kubenswrapper[4824]: I0224 00:17:55.140430 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2mj5\" (UniqueName: \"kubernetes.io/projected/823099c2-9764-455a-a682-57c154c0d895-kube-api-access-c2mj5\") pod \"observability-operator-59bdc8b94-hhf7q\" (UID: \"823099c2-9764-455a-a682-57c154c0d895\") " 
pod="openshift-operators/observability-operator-59bdc8b94-hhf7q" Feb 24 00:17:55 crc kubenswrapper[4824]: I0224 00:17:55.213631 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bdh4\" (UniqueName: \"kubernetes.io/projected/885263fe-5a06-4089-b662-d3e4dbc7d08e-kube-api-access-8bdh4\") pod \"perses-operator-5bf474d74f-frbxc\" (UID: \"885263fe-5a06-4089-b662-d3e4dbc7d08e\") " pod="openshift-operators/perses-operator-5bf474d74f-frbxc" Feb 24 00:17:55 crc kubenswrapper[4824]: I0224 00:17:55.213728 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/885263fe-5a06-4089-b662-d3e4dbc7d08e-openshift-service-ca\") pod \"perses-operator-5bf474d74f-frbxc\" (UID: \"885263fe-5a06-4089-b662-d3e4dbc7d08e\") " pod="openshift-operators/perses-operator-5bf474d74f-frbxc" Feb 24 00:17:55 crc kubenswrapper[4824]: I0224 00:17:55.215192 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/885263fe-5a06-4089-b662-d3e4dbc7d08e-openshift-service-ca\") pod \"perses-operator-5bf474d74f-frbxc\" (UID: \"885263fe-5a06-4089-b662-d3e4dbc7d08e\") " pod="openshift-operators/perses-operator-5bf474d74f-frbxc" Feb 24 00:17:55 crc kubenswrapper[4824]: I0224 00:17:55.247305 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bdh4\" (UniqueName: \"kubernetes.io/projected/885263fe-5a06-4089-b662-d3e4dbc7d08e-kube-api-access-8bdh4\") pod \"perses-operator-5bf474d74f-frbxc\" (UID: \"885263fe-5a06-4089-b662-d3e4dbc7d08e\") " pod="openshift-operators/perses-operator-5bf474d74f-frbxc" Feb 24 00:17:55 crc kubenswrapper[4824]: I0224 00:17:55.278337 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-hhf7q" Feb 24 00:17:55 crc kubenswrapper[4824]: I0224 00:17:55.389561 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-df47j"] Feb 24 00:17:55 crc kubenswrapper[4824]: W0224 00:17:55.423082 4824 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02a08fee_e933_4730_8755_7419c78d6525.slice/crio-24f184ffaaad436a493cca77ae22e0bba5d6a647fd6cfeab4abc2a00a2e13d75 WatchSource:0}: Error finding container 24f184ffaaad436a493cca77ae22e0bba5d6a647fd6cfeab4abc2a00a2e13d75: Status 404 returned error can't find the container with id 24f184ffaaad436a493cca77ae22e0bba5d6a647fd6cfeab4abc2a00a2e13d75 Feb 24 00:17:55 crc kubenswrapper[4824]: I0224 00:17:55.450930 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-frbxc" Feb 24 00:17:55 crc kubenswrapper[4824]: I0224 00:17:55.493544 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-579b5ff79b-2qg2s"] Feb 24 00:17:55 crc kubenswrapper[4824]: I0224 00:17:55.566194 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-hhf7q"] Feb 24 00:17:55 crc kubenswrapper[4824]: W0224 00:17:55.587575 4824 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod823099c2_9764_455a_a682_57c154c0d895.slice/crio-07e543b1d4d85b43dfc669fd140a601c0289bc8779bdc1d5bb9502005b69d6b6 WatchSource:0}: Error finding container 07e543b1d4d85b43dfc669fd140a601c0289bc8779bdc1d5bb9502005b69d6b6: Status 404 returned error can't find the container with id 07e543b1d4d85b43dfc669fd140a601c0289bc8779bdc1d5bb9502005b69d6b6 Feb 24 00:17:55 crc kubenswrapper[4824]: I0224 
00:17:55.629734 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-579b5ff79b-nwsdf"] Feb 24 00:17:55 crc kubenswrapper[4824]: I0224 00:17:55.639294 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-df47j" event={"ID":"02a08fee-e933-4730-8755-7419c78d6525","Type":"ContainerStarted","Data":"24f184ffaaad436a493cca77ae22e0bba5d6a647fd6cfeab4abc2a00a2e13d75"} Feb 24 00:17:55 crc kubenswrapper[4824]: I0224 00:17:55.640162 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-hhf7q" event={"ID":"823099c2-9764-455a-a682-57c154c0d895","Type":"ContainerStarted","Data":"07e543b1d4d85b43dfc669fd140a601c0289bc8779bdc1d5bb9502005b69d6b6"} Feb 24 00:17:55 crc kubenswrapper[4824]: I0224 00:17:55.646618 4824 generic.go:334] "Generic (PLEG): container finished" podID="379ee973-5632-434f-953c-7f23d7dc8f9d" containerID="62713628ab0cf1b8ac665aa19b02a03c9b8eeac677ad129a17555f65c436b0bc" exitCode=0 Feb 24 00:17:55 crc kubenswrapper[4824]: I0224 00:17:55.646712 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jcmlq" event={"ID":"379ee973-5632-434f-953c-7f23d7dc8f9d","Type":"ContainerDied","Data":"62713628ab0cf1b8ac665aa19b02a03c9b8eeac677ad129a17555f65c436b0bc"} Feb 24 00:17:55 crc kubenswrapper[4824]: I0224 00:17:55.652402 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-579b5ff79b-2qg2s" event={"ID":"7c2534a7-fe82-47e6-b6f9-2bf928bf7c9a","Type":"ContainerStarted","Data":"fe35a49ff8f75dbdc7344b5aa6d6833d5b42762e690827a1554252b0797cfd87"} Feb 24 00:17:55 crc kubenswrapper[4824]: I0224 00:17:55.915009 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-frbxc"] Feb 24 00:17:55 crc 
kubenswrapper[4824]: W0224 00:17:55.922525 4824 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod885263fe_5a06_4089_b662_d3e4dbc7d08e.slice/crio-7bcb24956391b4d5f6f8854317d49fadc6064ce27b580db74115c7ba7f08e478 WatchSource:0}: Error finding container 7bcb24956391b4d5f6f8854317d49fadc6064ce27b580db74115c7ba7f08e478: Status 404 returned error can't find the container with id 7bcb24956391b4d5f6f8854317d49fadc6064ce27b580db74115c7ba7f08e478 Feb 24 00:17:56 crc kubenswrapper[4824]: I0224 00:17:56.664995 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-579b5ff79b-nwsdf" event={"ID":"350461e1-7bfd-4095-9d74-4c3df3159694","Type":"ContainerStarted","Data":"7ec40daae64b22521d2710cb1c76929ab29f88eb8684bc42046b3f06a20a2438"} Feb 24 00:17:56 crc kubenswrapper[4824]: I0224 00:17:56.667964 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-frbxc" event={"ID":"885263fe-5a06-4089-b662-d3e4dbc7d08e","Type":"ContainerStarted","Data":"7bcb24956391b4d5f6f8854317d49fadc6064ce27b580db74115c7ba7f08e478"} Feb 24 00:17:57 crc kubenswrapper[4824]: I0224 00:17:57.079452 4824 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jcmlq" Feb 24 00:17:57 crc kubenswrapper[4824]: I0224 00:17:57.150755 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbnd2\" (UniqueName: \"kubernetes.io/projected/379ee973-5632-434f-953c-7f23d7dc8f9d-kube-api-access-kbnd2\") pod \"379ee973-5632-434f-953c-7f23d7dc8f9d\" (UID: \"379ee973-5632-434f-953c-7f23d7dc8f9d\") " Feb 24 00:17:57 crc kubenswrapper[4824]: I0224 00:17:57.150844 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/379ee973-5632-434f-953c-7f23d7dc8f9d-util\") pod \"379ee973-5632-434f-953c-7f23d7dc8f9d\" (UID: \"379ee973-5632-434f-953c-7f23d7dc8f9d\") " Feb 24 00:17:57 crc kubenswrapper[4824]: I0224 00:17:57.150987 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/379ee973-5632-434f-953c-7f23d7dc8f9d-bundle\") pod \"379ee973-5632-434f-953c-7f23d7dc8f9d\" (UID: \"379ee973-5632-434f-953c-7f23d7dc8f9d\") " Feb 24 00:17:57 crc kubenswrapper[4824]: I0224 00:17:57.152721 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/379ee973-5632-434f-953c-7f23d7dc8f9d-bundle" (OuterVolumeSpecName: "bundle") pod "379ee973-5632-434f-953c-7f23d7dc8f9d" (UID: "379ee973-5632-434f-953c-7f23d7dc8f9d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:17:57 crc kubenswrapper[4824]: I0224 00:17:57.157613 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/379ee973-5632-434f-953c-7f23d7dc8f9d-kube-api-access-kbnd2" (OuterVolumeSpecName: "kube-api-access-kbnd2") pod "379ee973-5632-434f-953c-7f23d7dc8f9d" (UID: "379ee973-5632-434f-953c-7f23d7dc8f9d"). InnerVolumeSpecName "kube-api-access-kbnd2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:17:57 crc kubenswrapper[4824]: I0224 00:17:57.163773 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/379ee973-5632-434f-953c-7f23d7dc8f9d-util" (OuterVolumeSpecName: "util") pod "379ee973-5632-434f-953c-7f23d7dc8f9d" (UID: "379ee973-5632-434f-953c-7f23d7dc8f9d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:17:57 crc kubenswrapper[4824]: I0224 00:17:57.253134 4824 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/379ee973-5632-434f-953c-7f23d7dc8f9d-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 00:17:57 crc kubenswrapper[4824]: I0224 00:17:57.253187 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbnd2\" (UniqueName: \"kubernetes.io/projected/379ee973-5632-434f-953c-7f23d7dc8f9d-kube-api-access-kbnd2\") on node \"crc\" DevicePath \"\"" Feb 24 00:17:57 crc kubenswrapper[4824]: I0224 00:17:57.253199 4824 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/379ee973-5632-434f-953c-7f23d7dc8f9d-util\") on node \"crc\" DevicePath \"\"" Feb 24 00:17:57 crc kubenswrapper[4824]: I0224 00:17:57.680308 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jcmlq" event={"ID":"379ee973-5632-434f-953c-7f23d7dc8f9d","Type":"ContainerDied","Data":"2df8ec581a42c6978d2dab631a2800272eb3bf9842b24962499fc7f88113c8e7"} Feb 24 00:17:57 crc kubenswrapper[4824]: I0224 00:17:57.680359 4824 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2df8ec581a42c6978d2dab631a2800272eb3bf9842b24962499fc7f88113c8e7" Feb 24 00:17:57 crc kubenswrapper[4824]: I0224 00:17:57.680432 4824 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jcmlq" Feb 24 00:18:02 crc kubenswrapper[4824]: I0224 00:18:02.445307 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-7d9xq"] Feb 24 00:18:02 crc kubenswrapper[4824]: E0224 00:18:02.446118 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="379ee973-5632-434f-953c-7f23d7dc8f9d" containerName="pull" Feb 24 00:18:02 crc kubenswrapper[4824]: I0224 00:18:02.446136 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="379ee973-5632-434f-953c-7f23d7dc8f9d" containerName="pull" Feb 24 00:18:02 crc kubenswrapper[4824]: E0224 00:18:02.446149 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="379ee973-5632-434f-953c-7f23d7dc8f9d" containerName="util" Feb 24 00:18:02 crc kubenswrapper[4824]: I0224 00:18:02.446155 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="379ee973-5632-434f-953c-7f23d7dc8f9d" containerName="util" Feb 24 00:18:02 crc kubenswrapper[4824]: E0224 00:18:02.446170 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="379ee973-5632-434f-953c-7f23d7dc8f9d" containerName="extract" Feb 24 00:18:02 crc kubenswrapper[4824]: I0224 00:18:02.446177 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="379ee973-5632-434f-953c-7f23d7dc8f9d" containerName="extract" Feb 24 00:18:02 crc kubenswrapper[4824]: I0224 00:18:02.446305 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="379ee973-5632-434f-953c-7f23d7dc8f9d" containerName="extract" Feb 24 00:18:02 crc kubenswrapper[4824]: I0224 00:18:02.446718 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/interconnect-operator-5bb49f789d-7d9xq" Feb 24 00:18:02 crc kubenswrapper[4824]: I0224 00:18:02.450595 4824 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"interconnect-operator-dockercfg-54htf" Feb 24 00:18:02 crc kubenswrapper[4824]: I0224 00:18:02.450755 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"openshift-service-ca.crt" Feb 24 00:18:02 crc kubenswrapper[4824]: I0224 00:18:02.451229 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"kube-root-ca.crt" Feb 24 00:18:02 crc kubenswrapper[4824]: I0224 00:18:02.459624 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-7d9xq"] Feb 24 00:18:02 crc kubenswrapper[4824]: I0224 00:18:02.569940 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92shp\" (UniqueName: \"kubernetes.io/projected/125693c0-b095-4b7e-9ce3-b96785d4198e-kube-api-access-92shp\") pod \"interconnect-operator-5bb49f789d-7d9xq\" (UID: \"125693c0-b095-4b7e-9ce3-b96785d4198e\") " pod="service-telemetry/interconnect-operator-5bb49f789d-7d9xq" Feb 24 00:18:02 crc kubenswrapper[4824]: I0224 00:18:02.671108 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92shp\" (UniqueName: \"kubernetes.io/projected/125693c0-b095-4b7e-9ce3-b96785d4198e-kube-api-access-92shp\") pod \"interconnect-operator-5bb49f789d-7d9xq\" (UID: \"125693c0-b095-4b7e-9ce3-b96785d4198e\") " pod="service-telemetry/interconnect-operator-5bb49f789d-7d9xq" Feb 24 00:18:02 crc kubenswrapper[4824]: I0224 00:18:02.700632 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92shp\" (UniqueName: \"kubernetes.io/projected/125693c0-b095-4b7e-9ce3-b96785d4198e-kube-api-access-92shp\") pod 
\"interconnect-operator-5bb49f789d-7d9xq\" (UID: \"125693c0-b095-4b7e-9ce3-b96785d4198e\") " pod="service-telemetry/interconnect-operator-5bb49f789d-7d9xq" Feb 24 00:18:02 crc kubenswrapper[4824]: I0224 00:18:02.768493 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/interconnect-operator-5bb49f789d-7d9xq" Feb 24 00:18:05 crc kubenswrapper[4824]: I0224 00:18:05.693948 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/elastic-operator-788857d49f-cs5c7"] Feb 24 00:18:05 crc kubenswrapper[4824]: I0224 00:18:05.695550 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elastic-operator-788857d49f-cs5c7" Feb 24 00:18:05 crc kubenswrapper[4824]: I0224 00:18:05.698751 4824 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elastic-operator-dockercfg-59f26" Feb 24 00:18:05 crc kubenswrapper[4824]: I0224 00:18:05.699066 4824 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elastic-operator-service-cert" Feb 24 00:18:05 crc kubenswrapper[4824]: I0224 00:18:05.718754 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elastic-operator-788857d49f-cs5c7"] Feb 24 00:18:05 crc kubenswrapper[4824]: I0224 00:18:05.821003 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/43133e56-401c-453a-a59c-723bd8301fce-webhook-cert\") pod \"elastic-operator-788857d49f-cs5c7\" (UID: \"43133e56-401c-453a-a59c-723bd8301fce\") " pod="service-telemetry/elastic-operator-788857d49f-cs5c7" Feb 24 00:18:05 crc kubenswrapper[4824]: I0224 00:18:05.821053 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-629r9\" (UniqueName: \"kubernetes.io/projected/43133e56-401c-453a-a59c-723bd8301fce-kube-api-access-629r9\") pod 
\"elastic-operator-788857d49f-cs5c7\" (UID: \"43133e56-401c-453a-a59c-723bd8301fce\") " pod="service-telemetry/elastic-operator-788857d49f-cs5c7" Feb 24 00:18:05 crc kubenswrapper[4824]: I0224 00:18:05.821093 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/43133e56-401c-453a-a59c-723bd8301fce-apiservice-cert\") pod \"elastic-operator-788857d49f-cs5c7\" (UID: \"43133e56-401c-453a-a59c-723bd8301fce\") " pod="service-telemetry/elastic-operator-788857d49f-cs5c7" Feb 24 00:18:05 crc kubenswrapper[4824]: I0224 00:18:05.895378 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-7d9xq"] Feb 24 00:18:05 crc kubenswrapper[4824]: I0224 00:18:05.922424 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/43133e56-401c-453a-a59c-723bd8301fce-apiservice-cert\") pod \"elastic-operator-788857d49f-cs5c7\" (UID: \"43133e56-401c-453a-a59c-723bd8301fce\") " pod="service-telemetry/elastic-operator-788857d49f-cs5c7" Feb 24 00:18:05 crc kubenswrapper[4824]: I0224 00:18:05.922567 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/43133e56-401c-453a-a59c-723bd8301fce-webhook-cert\") pod \"elastic-operator-788857d49f-cs5c7\" (UID: \"43133e56-401c-453a-a59c-723bd8301fce\") " pod="service-telemetry/elastic-operator-788857d49f-cs5c7" Feb 24 00:18:05 crc kubenswrapper[4824]: I0224 00:18:05.922625 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-629r9\" (UniqueName: \"kubernetes.io/projected/43133e56-401c-453a-a59c-723bd8301fce-kube-api-access-629r9\") pod \"elastic-operator-788857d49f-cs5c7\" (UID: \"43133e56-401c-453a-a59c-723bd8301fce\") " pod="service-telemetry/elastic-operator-788857d49f-cs5c7" Feb 24 00:18:05 crc 
kubenswrapper[4824]: I0224 00:18:05.938222 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/43133e56-401c-453a-a59c-723bd8301fce-webhook-cert\") pod \"elastic-operator-788857d49f-cs5c7\" (UID: \"43133e56-401c-453a-a59c-723bd8301fce\") " pod="service-telemetry/elastic-operator-788857d49f-cs5c7" Feb 24 00:18:05 crc kubenswrapper[4824]: I0224 00:18:05.945390 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/43133e56-401c-453a-a59c-723bd8301fce-apiservice-cert\") pod \"elastic-operator-788857d49f-cs5c7\" (UID: \"43133e56-401c-453a-a59c-723bd8301fce\") " pod="service-telemetry/elastic-operator-788857d49f-cs5c7" Feb 24 00:18:05 crc kubenswrapper[4824]: I0224 00:18:05.957692 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-629r9\" (UniqueName: \"kubernetes.io/projected/43133e56-401c-453a-a59c-723bd8301fce-kube-api-access-629r9\") pod \"elastic-operator-788857d49f-cs5c7\" (UID: \"43133e56-401c-453a-a59c-723bd8301fce\") " pod="service-telemetry/elastic-operator-788857d49f-cs5c7" Feb 24 00:18:06 crc kubenswrapper[4824]: I0224 00:18:06.061777 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/elastic-operator-788857d49f-cs5c7" Feb 24 00:18:06 crc kubenswrapper[4824]: I0224 00:18:06.510167 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elastic-operator-788857d49f-cs5c7"] Feb 24 00:18:06 crc kubenswrapper[4824]: I0224 00:18:06.758682 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/interconnect-operator-5bb49f789d-7d9xq" event={"ID":"125693c0-b095-4b7e-9ce3-b96785d4198e","Type":"ContainerStarted","Data":"b01bec5372b9cc3ce96768d6fef57c3c7da367ac60e8d83d635408a6c88e89f6"} Feb 24 00:18:06 crc kubenswrapper[4824]: I0224 00:18:06.760242 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-df47j" event={"ID":"02a08fee-e933-4730-8755-7419c78d6525","Type":"ContainerStarted","Data":"fcf1371bb8ce2cc80205fc8f654b7e6018324f4ee7dc104874aba26176cc50ee"} Feb 24 00:18:06 crc kubenswrapper[4824]: I0224 00:18:06.761445 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-579b5ff79b-nwsdf" event={"ID":"350461e1-7bfd-4095-9d74-4c3df3159694","Type":"ContainerStarted","Data":"469cf9061ae35cdb46a5045e32bcd68f5d4c19f070a6379c8b6c993146006b4e"} Feb 24 00:18:06 crc kubenswrapper[4824]: I0224 00:18:06.762484 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elastic-operator-788857d49f-cs5c7" event={"ID":"43133e56-401c-453a-a59c-723bd8301fce","Type":"ContainerStarted","Data":"9d1a26e622fac907f6f6b9e2185013009b4eada315908df1b22e0905ac4fe7e4"} Feb 24 00:18:06 crc kubenswrapper[4824]: I0224 00:18:06.763763 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-hhf7q" event={"ID":"823099c2-9764-455a-a682-57c154c0d895","Type":"ContainerStarted","Data":"0e9fe346b6722a8934542c82f6e4c0d4afae3e66748496f119aeb2f964d68642"} Feb 24 00:18:06 crc kubenswrapper[4824]: I0224 
00:18:06.764426 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-hhf7q" Feb 24 00:18:06 crc kubenswrapper[4824]: I0224 00:18:06.766194 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-579b5ff79b-2qg2s" event={"ID":"7c2534a7-fe82-47e6-b6f9-2bf928bf7c9a","Type":"ContainerStarted","Data":"fe0f410b8e4180707913c032dbf885915f74c199aaa682de4fbbcb68912bec25"} Feb 24 00:18:06 crc kubenswrapper[4824]: I0224 00:18:06.768998 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-frbxc" event={"ID":"885263fe-5a06-4089-b662-d3e4dbc7d08e","Type":"ContainerStarted","Data":"ce2508a2ea2ff5db86300d0cac7cebdec925662895fa099c1a5d4c296344b739"} Feb 24 00:18:06 crc kubenswrapper[4824]: I0224 00:18:06.769387 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-frbxc" Feb 24 00:18:06 crc kubenswrapper[4824]: I0224 00:18:06.780888 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-df47j" podStartSLOduration=2.672208942 podStartE2EDuration="12.780864244s" podCreationTimestamp="2026-02-24 00:17:54 +0000 UTC" firstStartedPulling="2026-02-24 00:17:55.427072256 +0000 UTC m=+739.416696725" lastFinishedPulling="2026-02-24 00:18:05.535727558 +0000 UTC m=+749.525352027" observedRunningTime="2026-02-24 00:18:06.775683421 +0000 UTC m=+750.765307890" watchObservedRunningTime="2026-02-24 00:18:06.780864244 +0000 UTC m=+750.770488723" Feb 24 00:18:06 crc kubenswrapper[4824]: I0224 00:18:06.788893 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-hhf7q" Feb 24 00:18:06 crc kubenswrapper[4824]: I0224 00:18:06.802046 4824 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-operators/perses-operator-5bf474d74f-frbxc" podStartSLOduration=2.229990546 podStartE2EDuration="11.802029557s" podCreationTimestamp="2026-02-24 00:17:55 +0000 UTC" firstStartedPulling="2026-02-24 00:17:55.925495886 +0000 UTC m=+739.915120345" lastFinishedPulling="2026-02-24 00:18:05.497534887 +0000 UTC m=+749.487159356" observedRunningTime="2026-02-24 00:18:06.799822781 +0000 UTC m=+750.789447250" watchObservedRunningTime="2026-02-24 00:18:06.802029557 +0000 UTC m=+750.791654026" Feb 24 00:18:06 crc kubenswrapper[4824]: I0224 00:18:06.831745 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-hhf7q" podStartSLOduration=2.88860733 podStartE2EDuration="12.83172331s" podCreationTimestamp="2026-02-24 00:17:54 +0000 UTC" firstStartedPulling="2026-02-24 00:17:55.59465367 +0000 UTC m=+739.584278139" lastFinishedPulling="2026-02-24 00:18:05.53776965 +0000 UTC m=+749.527394119" observedRunningTime="2026-02-24 00:18:06.829743359 +0000 UTC m=+750.819367838" watchObservedRunningTime="2026-02-24 00:18:06.83172331 +0000 UTC m=+750.821347779" Feb 24 00:18:06 crc kubenswrapper[4824]: I0224 00:18:06.858796 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-579b5ff79b-2qg2s" podStartSLOduration=2.905622326 podStartE2EDuration="12.858773314s" podCreationTimestamp="2026-02-24 00:17:54 +0000 UTC" firstStartedPulling="2026-02-24 00:17:55.51171837 +0000 UTC m=+739.501342839" lastFinishedPulling="2026-02-24 00:18:05.464869358 +0000 UTC m=+749.454493827" observedRunningTime="2026-02-24 00:18:06.854273299 +0000 UTC m=+750.843897768" watchObservedRunningTime="2026-02-24 00:18:06.858773314 +0000 UTC m=+750.848397783" Feb 24 00:18:06 crc kubenswrapper[4824]: I0224 00:18:06.880186 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-579b5ff79b-nwsdf" podStartSLOduration=3.050625551 podStartE2EDuration="12.880166594s" podCreationTimestamp="2026-02-24 00:17:54 +0000 UTC" firstStartedPulling="2026-02-24 00:17:55.646156223 +0000 UTC m=+739.635780692" lastFinishedPulling="2026-02-24 00:18:05.475697266 +0000 UTC m=+749.465321735" observedRunningTime="2026-02-24 00:18:06.878619204 +0000 UTC m=+750.868243703" watchObservedRunningTime="2026-02-24 00:18:06.880166594 +0000 UTC m=+750.869791063" Feb 24 00:18:09 crc kubenswrapper[4824]: I0224 00:18:09.790345 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elastic-operator-788857d49f-cs5c7" event={"ID":"43133e56-401c-453a-a59c-723bd8301fce","Type":"ContainerStarted","Data":"bbdb91bd5a692aeab89a376c7905813cf7aebf46e06652d8c2e2094e4736d44c"} Feb 24 00:18:09 crc kubenswrapper[4824]: I0224 00:18:09.811711 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/elastic-operator-788857d49f-cs5c7" podStartSLOduration=2.169936056 podStartE2EDuration="4.811687206s" podCreationTimestamp="2026-02-24 00:18:05 +0000 UTC" firstStartedPulling="2026-02-24 00:18:06.537872024 +0000 UTC m=+750.527496493" lastFinishedPulling="2026-02-24 00:18:09.179623174 +0000 UTC m=+753.169247643" observedRunningTime="2026-02-24 00:18:09.808088823 +0000 UTC m=+753.797713302" watchObservedRunningTime="2026-02-24 00:18:09.811687206 +0000 UTC m=+753.801311675" Feb 24 00:18:13 crc kubenswrapper[4824]: I0224 00:18:13.216467 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-p78vd"] Feb 24 00:18:13 crc kubenswrapper[4824]: I0224 00:18:13.217773 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-p78vd" Feb 24 00:18:13 crc kubenswrapper[4824]: I0224 00:18:13.219654 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Feb 24 00:18:13 crc kubenswrapper[4824]: I0224 00:18:13.219985 4824 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-jkwbj" Feb 24 00:18:13 crc kubenswrapper[4824]: I0224 00:18:13.220112 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Feb 24 00:18:13 crc kubenswrapper[4824]: I0224 00:18:13.239439 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-p78vd"] Feb 24 00:18:13 crc kubenswrapper[4824]: I0224 00:18:13.323895 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/af381dba-8d03-4a1d-94a5-cd8a45dbc318-tmp\") pod \"cert-manager-operator-controller-manager-5586865c96-p78vd\" (UID: \"af381dba-8d03-4a1d-94a5-cd8a45dbc318\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-p78vd" Feb 24 00:18:13 crc kubenswrapper[4824]: I0224 00:18:13.323966 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b2pt\" (UniqueName: \"kubernetes.io/projected/af381dba-8d03-4a1d-94a5-cd8a45dbc318-kube-api-access-7b2pt\") pod \"cert-manager-operator-controller-manager-5586865c96-p78vd\" (UID: \"af381dba-8d03-4a1d-94a5-cd8a45dbc318\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-p78vd" Feb 24 00:18:13 crc kubenswrapper[4824]: I0224 00:18:13.425758 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-7b2pt\" (UniqueName: \"kubernetes.io/projected/af381dba-8d03-4a1d-94a5-cd8a45dbc318-kube-api-access-7b2pt\") pod \"cert-manager-operator-controller-manager-5586865c96-p78vd\" (UID: \"af381dba-8d03-4a1d-94a5-cd8a45dbc318\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-p78vd" Feb 24 00:18:13 crc kubenswrapper[4824]: I0224 00:18:13.425925 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/af381dba-8d03-4a1d-94a5-cd8a45dbc318-tmp\") pod \"cert-manager-operator-controller-manager-5586865c96-p78vd\" (UID: \"af381dba-8d03-4a1d-94a5-cd8a45dbc318\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-p78vd" Feb 24 00:18:13 crc kubenswrapper[4824]: I0224 00:18:13.426684 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/af381dba-8d03-4a1d-94a5-cd8a45dbc318-tmp\") pod \"cert-manager-operator-controller-manager-5586865c96-p78vd\" (UID: \"af381dba-8d03-4a1d-94a5-cd8a45dbc318\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-p78vd" Feb 24 00:18:13 crc kubenswrapper[4824]: I0224 00:18:13.462824 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b2pt\" (UniqueName: \"kubernetes.io/projected/af381dba-8d03-4a1d-94a5-cd8a45dbc318-kube-api-access-7b2pt\") pod \"cert-manager-operator-controller-manager-5586865c96-p78vd\" (UID: \"af381dba-8d03-4a1d-94a5-cd8a45dbc318\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-p78vd" Feb 24 00:18:13 crc kubenswrapper[4824]: I0224 00:18:13.533817 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-p78vd" Feb 24 00:18:13 crc kubenswrapper[4824]: I0224 00:18:13.835004 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-p78vd"] Feb 24 00:18:13 crc kubenswrapper[4824]: W0224 00:18:13.852241 4824 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf381dba_8d03_4a1d_94a5_cd8a45dbc318.slice/crio-36d10d05706634f48ff722062f8ec4e4c41eb1fcfe9f039c0e2e620d7ad648b0 WatchSource:0}: Error finding container 36d10d05706634f48ff722062f8ec4e4c41eb1fcfe9f039c0e2e620d7ad648b0: Status 404 returned error can't find the container with id 36d10d05706634f48ff722062f8ec4e4c41eb1fcfe9f039c0e2e620d7ad648b0 Feb 24 00:18:14 crc kubenswrapper[4824]: I0224 00:18:14.827384 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-p78vd" event={"ID":"af381dba-8d03-4a1d-94a5-cd8a45dbc318","Type":"ContainerStarted","Data":"36d10d05706634f48ff722062f8ec4e4c41eb1fcfe9f039c0e2e620d7ad648b0"} Feb 24 00:18:15 crc kubenswrapper[4824]: I0224 00:18:15.454114 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-frbxc" Feb 24 00:18:17 crc kubenswrapper[4824]: I0224 00:18:17.849096 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-p78vd" event={"ID":"af381dba-8d03-4a1d-94a5-cd8a45dbc318","Type":"ContainerStarted","Data":"06419dbb9409c63c96175fc68d0b4512ecd169b3b09187a7ac9cb013c2a4d0d8"} Feb 24 00:18:17 crc kubenswrapper[4824]: I0224 00:18:17.871826 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-p78vd" 
podStartSLOduration=1.755610607 podStartE2EDuration="4.871808802s" podCreationTimestamp="2026-02-24 00:18:13 +0000 UTC" firstStartedPulling="2026-02-24 00:18:13.856109677 +0000 UTC m=+757.845734146" lastFinishedPulling="2026-02-24 00:18:16.972307872 +0000 UTC m=+760.961932341" observedRunningTime="2026-02-24 00:18:17.867758738 +0000 UTC m=+761.857383217" watchObservedRunningTime="2026-02-24 00:18:17.871808802 +0000 UTC m=+761.861433271" Feb 24 00:18:20 crc kubenswrapper[4824]: I0224 00:18:20.885972 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Feb 24 00:18:20 crc kubenswrapper[4824]: I0224 00:18:20.887415 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elasticsearch-es-default-0" Feb 24 00:18:20 crc kubenswrapper[4824]: I0224 00:18:20.893333 4824 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-internal-users" Feb 24 00:18:20 crc kubenswrapper[4824]: I0224 00:18:20.893372 4824 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-default-es-config" Feb 24 00:18:20 crc kubenswrapper[4824]: W0224 00:18:20.893590 4824 reflector.go:561] object-"service-telemetry"/"default-dockercfg-ddnnb": failed to list *v1.Secret: secrets "default-dockercfg-ddnnb" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "service-telemetry": no relationship found between node 'crc' and this object Feb 24 00:18:20 crc kubenswrapper[4824]: E0224 00:18:20.893636 4824 reflector.go:158] "Unhandled Error" err="object-\"service-telemetry\"/\"default-dockercfg-ddnnb\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"default-dockercfg-ddnnb\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"service-telemetry\": no relationship found between node 'crc' and this object" 
logger="UnhandledError" Feb 24 00:18:20 crc kubenswrapper[4824]: I0224 00:18:20.894835 4824 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-remote-ca" Feb 24 00:18:20 crc kubenswrapper[4824]: I0224 00:18:20.895439 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"elasticsearch-es-unicast-hosts" Feb 24 00:18:20 crc kubenswrapper[4824]: I0224 00:18:20.895849 4824 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-default-es-transport-certs" Feb 24 00:18:20 crc kubenswrapper[4824]: W0224 00:18:20.895881 4824 reflector.go:561] object-"service-telemetry"/"elasticsearch-es-scripts": failed to list *v1.ConfigMap: configmaps "elasticsearch-es-scripts" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "service-telemetry": no relationship found between node 'crc' and this object Feb 24 00:18:20 crc kubenswrapper[4824]: E0224 00:18:20.895932 4824 reflector.go:158] "Unhandled Error" err="object-\"service-telemetry\"/\"elasticsearch-es-scripts\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"elasticsearch-es-scripts\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"service-telemetry\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 24 00:18:20 crc kubenswrapper[4824]: I0224 00:18:20.896512 4824 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-http-certs-internal" Feb 24 00:18:20 crc kubenswrapper[4824]: I0224 00:18:20.901153 4824 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-xpack-file-realm" Feb 24 00:18:20 crc kubenswrapper[4824]: I0224 00:18:20.915015 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["service-telemetry/elasticsearch-es-default-0"] Feb 24 00:18:20 crc kubenswrapper[4824]: I0224 00:18:20.944854 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/96f9c835-f7c9-4774-9b95-8911ab4ffb23-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"96f9c835-f7c9-4774-9b95-8911ab4ffb23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 24 00:18:20 crc kubenswrapper[4824]: I0224 00:18:20.944924 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/96f9c835-f7c9-4774-9b95-8911ab4ffb23-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"96f9c835-f7c9-4774-9b95-8911ab4ffb23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 24 00:18:20 crc kubenswrapper[4824]: I0224 00:18:20.944954 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/96f9c835-f7c9-4774-9b95-8911ab4ffb23-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"96f9c835-f7c9-4774-9b95-8911ab4ffb23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 24 00:18:20 crc kubenswrapper[4824]: I0224 00:18:20.944979 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/96f9c835-f7c9-4774-9b95-8911ab4ffb23-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"96f9c835-f7c9-4774-9b95-8911ab4ffb23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 24 00:18:20 crc kubenswrapper[4824]: I0224 00:18:20.945010 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/96f9c835-f7c9-4774-9b95-8911ab4ffb23-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"96f9c835-f7c9-4774-9b95-8911ab4ffb23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 24 00:18:20 crc kubenswrapper[4824]: I0224 00:18:20.945030 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/96f9c835-f7c9-4774-9b95-8911ab4ffb23-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"96f9c835-f7c9-4774-9b95-8911ab4ffb23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 24 00:18:20 crc kubenswrapper[4824]: I0224 00:18:20.945052 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/96f9c835-f7c9-4774-9b95-8911ab4ffb23-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"96f9c835-f7c9-4774-9b95-8911ab4ffb23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 24 00:18:20 crc kubenswrapper[4824]: I0224 00:18:20.945077 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/96f9c835-f7c9-4774-9b95-8911ab4ffb23-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"96f9c835-f7c9-4774-9b95-8911ab4ffb23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 24 00:18:20 crc kubenswrapper[4824]: I0224 00:18:20.945103 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/96f9c835-f7c9-4774-9b95-8911ab4ffb23-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"96f9c835-f7c9-4774-9b95-8911ab4ffb23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 24 00:18:20 
crc kubenswrapper[4824]: I0224 00:18:20.945137 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/96f9c835-f7c9-4774-9b95-8911ab4ffb23-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"96f9c835-f7c9-4774-9b95-8911ab4ffb23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 24 00:18:20 crc kubenswrapper[4824]: I0224 00:18:20.945160 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/96f9c835-f7c9-4774-9b95-8911ab4ffb23-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"96f9c835-f7c9-4774-9b95-8911ab4ffb23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 24 00:18:20 crc kubenswrapper[4824]: I0224 00:18:20.945187 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/96f9c835-f7c9-4774-9b95-8911ab4ffb23-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"96f9c835-f7c9-4774-9b95-8911ab4ffb23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 24 00:18:20 crc kubenswrapper[4824]: I0224 00:18:20.945214 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/96f9c835-f7c9-4774-9b95-8911ab4ffb23-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"96f9c835-f7c9-4774-9b95-8911ab4ffb23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 24 00:18:20 crc kubenswrapper[4824]: I0224 00:18:20.945242 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/96f9c835-f7c9-4774-9b95-8911ab4ffb23-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"96f9c835-f7c9-4774-9b95-8911ab4ffb23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 24 00:18:20 crc kubenswrapper[4824]: I0224 00:18:20.945268 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/96f9c835-f7c9-4774-9b95-8911ab4ffb23-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"96f9c835-f7c9-4774-9b95-8911ab4ffb23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 24 00:18:21 crc kubenswrapper[4824]: I0224 00:18:21.046577 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/96f9c835-f7c9-4774-9b95-8911ab4ffb23-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"96f9c835-f7c9-4774-9b95-8911ab4ffb23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 24 00:18:21 crc kubenswrapper[4824]: I0224 00:18:21.046657 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/96f9c835-f7c9-4774-9b95-8911ab4ffb23-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"96f9c835-f7c9-4774-9b95-8911ab4ffb23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 24 00:18:21 crc kubenswrapper[4824]: I0224 00:18:21.046686 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/96f9c835-f7c9-4774-9b95-8911ab4ffb23-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"96f9c835-f7c9-4774-9b95-8911ab4ffb23\") " 
pod="service-telemetry/elasticsearch-es-default-0" Feb 24 00:18:21 crc kubenswrapper[4824]: I0224 00:18:21.046708 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/96f9c835-f7c9-4774-9b95-8911ab4ffb23-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"96f9c835-f7c9-4774-9b95-8911ab4ffb23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 24 00:18:21 crc kubenswrapper[4824]: I0224 00:18:21.046732 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/96f9c835-f7c9-4774-9b95-8911ab4ffb23-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"96f9c835-f7c9-4774-9b95-8911ab4ffb23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 24 00:18:21 crc kubenswrapper[4824]: I0224 00:18:21.046753 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/96f9c835-f7c9-4774-9b95-8911ab4ffb23-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"96f9c835-f7c9-4774-9b95-8911ab4ffb23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 24 00:18:21 crc kubenswrapper[4824]: I0224 00:18:21.046772 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/96f9c835-f7c9-4774-9b95-8911ab4ffb23-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"96f9c835-f7c9-4774-9b95-8911ab4ffb23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 24 00:18:21 crc kubenswrapper[4824]: I0224 00:18:21.046798 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/96f9c835-f7c9-4774-9b95-8911ab4ffb23-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: 
\"96f9c835-f7c9-4774-9b95-8911ab4ffb23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 24 00:18:21 crc kubenswrapper[4824]: I0224 00:18:21.046821 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/96f9c835-f7c9-4774-9b95-8911ab4ffb23-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"96f9c835-f7c9-4774-9b95-8911ab4ffb23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 24 00:18:21 crc kubenswrapper[4824]: I0224 00:18:21.046850 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/96f9c835-f7c9-4774-9b95-8911ab4ffb23-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"96f9c835-f7c9-4774-9b95-8911ab4ffb23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 24 00:18:21 crc kubenswrapper[4824]: I0224 00:18:21.046871 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/96f9c835-f7c9-4774-9b95-8911ab4ffb23-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"96f9c835-f7c9-4774-9b95-8911ab4ffb23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 24 00:18:21 crc kubenswrapper[4824]: I0224 00:18:21.046894 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/96f9c835-f7c9-4774-9b95-8911ab4ffb23-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"96f9c835-f7c9-4774-9b95-8911ab4ffb23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 24 00:18:21 crc kubenswrapper[4824]: I0224 00:18:21.046922 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/96f9c835-f7c9-4774-9b95-8911ab4ffb23-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"96f9c835-f7c9-4774-9b95-8911ab4ffb23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 24 00:18:21 crc kubenswrapper[4824]: I0224 00:18:21.046949 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/96f9c835-f7c9-4774-9b95-8911ab4ffb23-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"96f9c835-f7c9-4774-9b95-8911ab4ffb23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 24 00:18:21 crc kubenswrapper[4824]: I0224 00:18:21.046973 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/96f9c835-f7c9-4774-9b95-8911ab4ffb23-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"96f9c835-f7c9-4774-9b95-8911ab4ffb23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 24 00:18:21 crc kubenswrapper[4824]: I0224 00:18:21.047825 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/96f9c835-f7c9-4774-9b95-8911ab4ffb23-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"96f9c835-f7c9-4774-9b95-8911ab4ffb23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 24 00:18:21 crc kubenswrapper[4824]: I0224 00:18:21.048530 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/96f9c835-f7c9-4774-9b95-8911ab4ffb23-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"96f9c835-f7c9-4774-9b95-8911ab4ffb23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 
24 00:18:21 crc kubenswrapper[4824]: I0224 00:18:21.050128 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/96f9c835-f7c9-4774-9b95-8911ab4ffb23-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"96f9c835-f7c9-4774-9b95-8911ab4ffb23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 24 00:18:21 crc kubenswrapper[4824]: I0224 00:18:21.050218 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/96f9c835-f7c9-4774-9b95-8911ab4ffb23-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"96f9c835-f7c9-4774-9b95-8911ab4ffb23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 24 00:18:21 crc kubenswrapper[4824]: I0224 00:18:21.050656 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/96f9c835-f7c9-4774-9b95-8911ab4ffb23-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"96f9c835-f7c9-4774-9b95-8911ab4ffb23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 24 00:18:21 crc kubenswrapper[4824]: I0224 00:18:21.051357 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/96f9c835-f7c9-4774-9b95-8911ab4ffb23-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"96f9c835-f7c9-4774-9b95-8911ab4ffb23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 24 00:18:21 crc kubenswrapper[4824]: I0224 00:18:21.051671 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/96f9c835-f7c9-4774-9b95-8911ab4ffb23-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: 
\"96f9c835-f7c9-4774-9b95-8911ab4ffb23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 24 00:18:21 crc kubenswrapper[4824]: I0224 00:18:21.055703 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/96f9c835-f7c9-4774-9b95-8911ab4ffb23-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"96f9c835-f7c9-4774-9b95-8911ab4ffb23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 24 00:18:21 crc kubenswrapper[4824]: I0224 00:18:21.055765 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/96f9c835-f7c9-4774-9b95-8911ab4ffb23-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"96f9c835-f7c9-4774-9b95-8911ab4ffb23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 24 00:18:21 crc kubenswrapper[4824]: I0224 00:18:21.056043 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/96f9c835-f7c9-4774-9b95-8911ab4ffb23-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"96f9c835-f7c9-4774-9b95-8911ab4ffb23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 24 00:18:21 crc kubenswrapper[4824]: I0224 00:18:21.056161 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/96f9c835-f7c9-4774-9b95-8911ab4ffb23-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"96f9c835-f7c9-4774-9b95-8911ab4ffb23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 24 00:18:21 crc kubenswrapper[4824]: I0224 00:18:21.056281 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-xpack-file-realm\" (UniqueName: 
\"kubernetes.io/secret/96f9c835-f7c9-4774-9b95-8911ab4ffb23-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"96f9c835-f7c9-4774-9b95-8911ab4ffb23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 24 00:18:21 crc kubenswrapper[4824]: I0224 00:18:21.056755 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/96f9c835-f7c9-4774-9b95-8911ab4ffb23-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"96f9c835-f7c9-4774-9b95-8911ab4ffb23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 24 00:18:21 crc kubenswrapper[4824]: I0224 00:18:21.058343 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/96f9c835-f7c9-4774-9b95-8911ab4ffb23-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"96f9c835-f7c9-4774-9b95-8911ab4ffb23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 24 00:18:21 crc kubenswrapper[4824]: I0224 00:18:21.698617 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-m8rqb"] Feb 24 00:18:21 crc kubenswrapper[4824]: I0224 00:18:21.699646 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-m8rqb" Feb 24 00:18:21 crc kubenswrapper[4824]: I0224 00:18:21.705834 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Feb 24 00:18:21 crc kubenswrapper[4824]: I0224 00:18:21.706012 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Feb 24 00:18:21 crc kubenswrapper[4824]: I0224 00:18:21.706145 4824 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-vsvqw" Feb 24 00:18:21 crc kubenswrapper[4824]: I0224 00:18:21.718947 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-m8rqb"] Feb 24 00:18:21 crc kubenswrapper[4824]: I0224 00:18:21.781679 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ad293038-bf1d-4800-bd32-9488c5f19e95-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-m8rqb\" (UID: \"ad293038-bf1d-4800-bd32-9488c5f19e95\") " pod="cert-manager/cert-manager-webhook-6888856db4-m8rqb" Feb 24 00:18:21 crc kubenswrapper[4824]: I0224 00:18:21.782197 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwd8b\" (UniqueName: \"kubernetes.io/projected/ad293038-bf1d-4800-bd32-9488c5f19e95-kube-api-access-jwd8b\") pod \"cert-manager-webhook-6888856db4-m8rqb\" (UID: \"ad293038-bf1d-4800-bd32-9488c5f19e95\") " pod="cert-manager/cert-manager-webhook-6888856db4-m8rqb" Feb 24 00:18:21 crc kubenswrapper[4824]: I0224 00:18:21.883331 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ad293038-bf1d-4800-bd32-9488c5f19e95-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-m8rqb\" (UID: 
\"ad293038-bf1d-4800-bd32-9488c5f19e95\") " pod="cert-manager/cert-manager-webhook-6888856db4-m8rqb" Feb 24 00:18:21 crc kubenswrapper[4824]: I0224 00:18:21.883436 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwd8b\" (UniqueName: \"kubernetes.io/projected/ad293038-bf1d-4800-bd32-9488c5f19e95-kube-api-access-jwd8b\") pod \"cert-manager-webhook-6888856db4-m8rqb\" (UID: \"ad293038-bf1d-4800-bd32-9488c5f19e95\") " pod="cert-manager/cert-manager-webhook-6888856db4-m8rqb" Feb 24 00:18:21 crc kubenswrapper[4824]: I0224 00:18:21.919565 4824 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-dockercfg-ddnnb" Feb 24 00:18:21 crc kubenswrapper[4824]: I0224 00:18:21.922143 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ad293038-bf1d-4800-bd32-9488c5f19e95-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-m8rqb\" (UID: \"ad293038-bf1d-4800-bd32-9488c5f19e95\") " pod="cert-manager/cert-manager-webhook-6888856db4-m8rqb" Feb 24 00:18:21 crc kubenswrapper[4824]: I0224 00:18:21.922903 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwd8b\" (UniqueName: \"kubernetes.io/projected/ad293038-bf1d-4800-bd32-9488c5f19e95-kube-api-access-jwd8b\") pod \"cert-manager-webhook-6888856db4-m8rqb\" (UID: \"ad293038-bf1d-4800-bd32-9488c5f19e95\") " pod="cert-manager/cert-manager-webhook-6888856db4-m8rqb" Feb 24 00:18:22 crc kubenswrapper[4824]: I0224 00:18:22.013754 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-m8rqb" Feb 24 00:18:22 crc kubenswrapper[4824]: E0224 00:18:22.050365 4824 configmap.go:193] Couldn't get configMap service-telemetry/elasticsearch-es-scripts: failed to sync configmap cache: timed out waiting for the condition Feb 24 00:18:22 crc kubenswrapper[4824]: E0224 00:18:22.050459 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/96f9c835-f7c9-4774-9b95-8911ab4ffb23-elastic-internal-scripts podName:96f9c835-f7c9-4774-9b95-8911ab4ffb23 nodeName:}" failed. No retries permitted until 2026-02-24 00:18:22.550441289 +0000 UTC m=+766.540065758 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "elastic-internal-scripts" (UniqueName: "kubernetes.io/configmap/96f9c835-f7c9-4774-9b95-8911ab4ffb23-elastic-internal-scripts") pod "elasticsearch-es-default-0" (UID: "96f9c835-f7c9-4774-9b95-8911ab4ffb23") : failed to sync configmap cache: timed out waiting for the condition Feb 24 00:18:22 crc kubenswrapper[4824]: I0224 00:18:22.448790 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"elasticsearch-es-scripts" Feb 24 00:18:22 crc kubenswrapper[4824]: I0224 00:18:22.528752 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-m8rqb"] Feb 24 00:18:22 crc kubenswrapper[4824]: W0224 00:18:22.565113 4824 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad293038_bf1d_4800_bd32_9488c5f19e95.slice/crio-bdb463b23ac5ac9e9024eb9aef4baee7f3b4e4f17c968fa5d992aeb24602eeb8 WatchSource:0}: Error finding container bdb463b23ac5ac9e9024eb9aef4baee7f3b4e4f17c968fa5d992aeb24602eeb8: Status 404 returned error can't find the container with id bdb463b23ac5ac9e9024eb9aef4baee7f3b4e4f17c968fa5d992aeb24602eeb8 Feb 24 00:18:22 crc kubenswrapper[4824]: I0224 00:18:22.593915 4824 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/96f9c835-f7c9-4774-9b95-8911ab4ffb23-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"96f9c835-f7c9-4774-9b95-8911ab4ffb23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 24 00:18:22 crc kubenswrapper[4824]: I0224 00:18:22.595581 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/96f9c835-f7c9-4774-9b95-8911ab4ffb23-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"96f9c835-f7c9-4774-9b95-8911ab4ffb23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 24 00:18:22 crc kubenswrapper[4824]: I0224 00:18:22.706381 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elasticsearch-es-default-0" Feb 24 00:18:22 crc kubenswrapper[4824]: I0224 00:18:22.886495 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-m8rqb" event={"ID":"ad293038-bf1d-4800-bd32-9488c5f19e95","Type":"ContainerStarted","Data":"bdb463b23ac5ac9e9024eb9aef4baee7f3b4e4f17c968fa5d992aeb24602eeb8"} Feb 24 00:18:23 crc kubenswrapper[4824]: I0224 00:18:23.256310 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Feb 24 00:18:23 crc kubenswrapper[4824]: W0224 00:18:23.272152 4824 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96f9c835_f7c9_4774_9b95_8911ab4ffb23.slice/crio-a53751883953d3360125344cc83cacee9f03420695f0b2faf50f55b76b645c31 WatchSource:0}: Error finding container a53751883953d3360125344cc83cacee9f03420695f0b2faf50f55b76b645c31: Status 404 returned error can't find the container with id a53751883953d3360125344cc83cacee9f03420695f0b2faf50f55b76b645c31 Feb 24 00:18:23 crc 
kubenswrapper[4824]: I0224 00:18:23.896381 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"96f9c835-f7c9-4774-9b95-8911ab4ffb23","Type":"ContainerStarted","Data":"a53751883953d3360125344cc83cacee9f03420695f0b2faf50f55b76b645c31"} Feb 24 00:18:24 crc kubenswrapper[4824]: I0224 00:18:24.154370 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-qhzcr"] Feb 24 00:18:24 crc kubenswrapper[4824]: I0224 00:18:24.155819 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-qhzcr" Feb 24 00:18:24 crc kubenswrapper[4824]: I0224 00:18:24.173456 4824 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-hwctf" Feb 24 00:18:24 crc kubenswrapper[4824]: I0224 00:18:24.179238 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-qhzcr"] Feb 24 00:18:24 crc kubenswrapper[4824]: I0224 00:18:24.225847 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/219daf0d-f400-4a2c-8374-5c23e10c27a6-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-qhzcr\" (UID: \"219daf0d-f400-4a2c-8374-5c23e10c27a6\") " pod="cert-manager/cert-manager-cainjector-5545bd876-qhzcr" Feb 24 00:18:24 crc kubenswrapper[4824]: I0224 00:18:24.225937 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbkvb\" (UniqueName: \"kubernetes.io/projected/219daf0d-f400-4a2c-8374-5c23e10c27a6-kube-api-access-xbkvb\") pod \"cert-manager-cainjector-5545bd876-qhzcr\" (UID: \"219daf0d-f400-4a2c-8374-5c23e10c27a6\") " pod="cert-manager/cert-manager-cainjector-5545bd876-qhzcr" Feb 24 00:18:24 crc kubenswrapper[4824]: I0224 00:18:24.326804 4824 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbkvb\" (UniqueName: \"kubernetes.io/projected/219daf0d-f400-4a2c-8374-5c23e10c27a6-kube-api-access-xbkvb\") pod \"cert-manager-cainjector-5545bd876-qhzcr\" (UID: \"219daf0d-f400-4a2c-8374-5c23e10c27a6\") " pod="cert-manager/cert-manager-cainjector-5545bd876-qhzcr" Feb 24 00:18:24 crc kubenswrapper[4824]: I0224 00:18:24.326967 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/219daf0d-f400-4a2c-8374-5c23e10c27a6-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-qhzcr\" (UID: \"219daf0d-f400-4a2c-8374-5c23e10c27a6\") " pod="cert-manager/cert-manager-cainjector-5545bd876-qhzcr" Feb 24 00:18:24 crc kubenswrapper[4824]: I0224 00:18:24.346556 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbkvb\" (UniqueName: \"kubernetes.io/projected/219daf0d-f400-4a2c-8374-5c23e10c27a6-kube-api-access-xbkvb\") pod \"cert-manager-cainjector-5545bd876-qhzcr\" (UID: \"219daf0d-f400-4a2c-8374-5c23e10c27a6\") " pod="cert-manager/cert-manager-cainjector-5545bd876-qhzcr" Feb 24 00:18:24 crc kubenswrapper[4824]: I0224 00:18:24.346980 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/219daf0d-f400-4a2c-8374-5c23e10c27a6-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-qhzcr\" (UID: \"219daf0d-f400-4a2c-8374-5c23e10c27a6\") " pod="cert-manager/cert-manager-cainjector-5545bd876-qhzcr" Feb 24 00:18:24 crc kubenswrapper[4824]: I0224 00:18:24.496501 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-qhzcr" Feb 24 00:18:25 crc kubenswrapper[4824]: I0224 00:18:25.279709 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-qhzcr"] Feb 24 00:18:25 crc kubenswrapper[4824]: W0224 00:18:25.292831 4824 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod219daf0d_f400_4a2c_8374_5c23e10c27a6.slice/crio-111e6a81e36b2380ac3e3ce8cd8839b9ae71c986b50bc5d40bdd120bb29d0cca WatchSource:0}: Error finding container 111e6a81e36b2380ac3e3ce8cd8839b9ae71c986b50bc5d40bdd120bb29d0cca: Status 404 returned error can't find the container with id 111e6a81e36b2380ac3e3ce8cd8839b9ae71c986b50bc5d40bdd120bb29d0cca Feb 24 00:18:25 crc kubenswrapper[4824]: I0224 00:18:25.935501 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-qhzcr" event={"ID":"219daf0d-f400-4a2c-8374-5c23e10c27a6","Type":"ContainerStarted","Data":"111e6a81e36b2380ac3e3ce8cd8839b9ae71c986b50bc5d40bdd120bb29d0cca"} Feb 24 00:18:31 crc kubenswrapper[4824]: I0224 00:18:31.953276 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-9mpql"] Feb 24 00:18:31 crc kubenswrapper[4824]: I0224 00:18:31.954788 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-9mpql" Feb 24 00:18:31 crc kubenswrapper[4824]: I0224 00:18:31.959350 4824 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-t6lmx" Feb 24 00:18:31 crc kubenswrapper[4824]: I0224 00:18:31.963548 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-9mpql"] Feb 24 00:18:32 crc kubenswrapper[4824]: I0224 00:18:32.066553 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm8nb\" (UniqueName: \"kubernetes.io/projected/1f370348-c40e-4096-98c1-d681f34b8659-kube-api-access-cm8nb\") pod \"cert-manager-545d4d4674-9mpql\" (UID: \"1f370348-c40e-4096-98c1-d681f34b8659\") " pod="cert-manager/cert-manager-545d4d4674-9mpql" Feb 24 00:18:32 crc kubenswrapper[4824]: I0224 00:18:32.066609 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1f370348-c40e-4096-98c1-d681f34b8659-bound-sa-token\") pod \"cert-manager-545d4d4674-9mpql\" (UID: \"1f370348-c40e-4096-98c1-d681f34b8659\") " pod="cert-manager/cert-manager-545d4d4674-9mpql" Feb 24 00:18:32 crc kubenswrapper[4824]: I0224 00:18:32.167495 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1f370348-c40e-4096-98c1-d681f34b8659-bound-sa-token\") pod \"cert-manager-545d4d4674-9mpql\" (UID: \"1f370348-c40e-4096-98c1-d681f34b8659\") " pod="cert-manager/cert-manager-545d4d4674-9mpql" Feb 24 00:18:32 crc kubenswrapper[4824]: I0224 00:18:32.167606 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm8nb\" (UniqueName: \"kubernetes.io/projected/1f370348-c40e-4096-98c1-d681f34b8659-kube-api-access-cm8nb\") pod \"cert-manager-545d4d4674-9mpql\" (UID: 
\"1f370348-c40e-4096-98c1-d681f34b8659\") " pod="cert-manager/cert-manager-545d4d4674-9mpql" Feb 24 00:18:32 crc kubenswrapper[4824]: I0224 00:18:32.202673 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm8nb\" (UniqueName: \"kubernetes.io/projected/1f370348-c40e-4096-98c1-d681f34b8659-kube-api-access-cm8nb\") pod \"cert-manager-545d4d4674-9mpql\" (UID: \"1f370348-c40e-4096-98c1-d681f34b8659\") " pod="cert-manager/cert-manager-545d4d4674-9mpql" Feb 24 00:18:32 crc kubenswrapper[4824]: I0224 00:18:32.215443 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1f370348-c40e-4096-98c1-d681f34b8659-bound-sa-token\") pod \"cert-manager-545d4d4674-9mpql\" (UID: \"1f370348-c40e-4096-98c1-d681f34b8659\") " pod="cert-manager/cert-manager-545d4d4674-9mpql" Feb 24 00:18:32 crc kubenswrapper[4824]: I0224 00:18:32.271020 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-9mpql" Feb 24 00:18:38 crc kubenswrapper[4824]: E0224 00:18:38.906982 4824 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cert-manager/jetstack-cert-manager-rhel9@sha256:903ce74138b1ffc735846a7c5fcdf62bbe82ca29568a6b38caec2656f6637671" Feb 24 00:18:38 crc kubenswrapper[4824]: E0224 00:18:38.908080 4824 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cert-manager-webhook,Image:registry.redhat.io/cert-manager/jetstack-cert-manager-rhel9@sha256:903ce74138b1ffc735846a7c5fcdf62bbe82ca29568a6b38caec2656f6637671,Command:[/app/cmd/webhook/webhook],Args:[--dynamic-serving-ca-secret-name=cert-manager-webhook-ca --dynamic-serving-ca-secret-namespace=$(POD_NAMESPACE) 
--dynamic-serving-dns-names=cert-manager-webhook,cert-manager-webhook.$(POD_NAMESPACE),cert-manager-webhook.$(POD_NAMESPACE).svc --secure-port=10250 --v=2],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:10250,Protocol:TCP,HostIP:,},ContainerPort{Name:healthcheck,HostPort:0,ContainerPort:6080,Protocol:TCP,HostIP:,},ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:9402,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:POD_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:bound-sa-token,ReadOnly:true,MountPath:/var/run/secrets/openshift/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jwd8b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{1 0 healthcheck},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:60,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{1 0 
healthcheck},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000690000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cert-manager-webhook-6888856db4-m8rqb_cert-manager(ad293038-bf1d-4800-bd32-9488c5f19e95): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 24 00:18:38 crc kubenswrapper[4824]: E0224 00:18:38.910956 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cert-manager-webhook\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="cert-manager/cert-manager-webhook-6888856db4-m8rqb" podUID="ad293038-bf1d-4800-bd32-9488c5f19e95" Feb 24 00:18:38 crc kubenswrapper[4824]: E0224 00:18:38.911284 4824 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cert-manager/jetstack-cert-manager-rhel9@sha256:903ce74138b1ffc735846a7c5fcdf62bbe82ca29568a6b38caec2656f6637671" Feb 24 00:18:38 crc kubenswrapper[4824]: E0224 00:18:38.911671 4824 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:cert-manager-cainjector,Image:registry.redhat.io/cert-manager/jetstack-cert-manager-rhel9@sha256:903ce74138b1ffc735846a7c5fcdf62bbe82ca29568a6b38caec2656f6637671,Command:[/app/cmd/cainjector/cainjector],Args:[--leader-election-namespace=kube-system --v=2],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:9402,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:POD_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:bound-sa-token,ReadOnly:true,MountPath:/var/run/secrets/openshift/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xbkvb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000690000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cert-manager-cainjector-5545bd876-qhzcr_cert-manager(219daf0d-f400-4a2c-8374-5c23e10c27a6): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 24 00:18:38 crc 
kubenswrapper[4824]: E0224 00:18:38.913938 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cert-manager-cainjector\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="cert-manager/cert-manager-cainjector-5545bd876-qhzcr" podUID="219daf0d-f400-4a2c-8374-5c23e10c27a6" Feb 24 00:18:39 crc kubenswrapper[4824]: E0224 00:18:39.022307 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cert-manager-cainjector\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cert-manager/jetstack-cert-manager-rhel9@sha256:903ce74138b1ffc735846a7c5fcdf62bbe82ca29568a6b38caec2656f6637671\\\"\"" pod="cert-manager/cert-manager-cainjector-5545bd876-qhzcr" podUID="219daf0d-f400-4a2c-8374-5c23e10c27a6" Feb 24 00:18:39 crc kubenswrapper[4824]: E0224 00:18:39.022344 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cert-manager-webhook\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cert-manager/jetstack-cert-manager-rhel9@sha256:903ce74138b1ffc735846a7c5fcdf62bbe82ca29568a6b38caec2656f6637671\\\"\"" pod="cert-manager/cert-manager-webhook-6888856db4-m8rqb" podUID="ad293038-bf1d-4800-bd32-9488c5f19e95" Feb 24 00:18:39 crc kubenswrapper[4824]: E0224 00:18:39.324866 4824 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/amq7/amq-interconnect-operator@sha256:a8b621237c872ded2a1d1d948fbebd693429e4a1ced1d7922406241a078d3d43" Feb 24 00:18:39 crc kubenswrapper[4824]: E0224 00:18:39.325216 4824 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:interconnect-operator,Image:registry.redhat.io/amq7/amq-interconnect-operator@sha256:a8b621237c872ded2a1d1d948fbebd693429e4a1ced1d7922406241a078d3d43,Command:[qdr-operator],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:60000,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:WATCH_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:OPERATOR_NAME,Value:qdr-operator,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_QDROUTERD_IMAGE,Value:registry.redhat.io/amq7/amq-interconnect@sha256:31d87473fa684178a694f9ee331d3c80f2653f9533cb65c2a325752166a077e9,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:amq7-interconnect-operator.v1.10.20,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-92shp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000670000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
interconnect-operator-5bb49f789d-7d9xq_service-telemetry(125693c0-b095-4b7e-9ce3-b96785d4198e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 24 00:18:39 crc kubenswrapper[4824]: E0224 00:18:39.326764 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"interconnect-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="service-telemetry/interconnect-operator-5bb49f789d-7d9xq" podUID="125693c0-b095-4b7e-9ce3-b96785d4198e" Feb 24 00:18:40 crc kubenswrapper[4824]: E0224 00:18:40.032220 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"interconnect-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/amq7/amq-interconnect-operator@sha256:a8b621237c872ded2a1d1d948fbebd693429e4a1ced1d7922406241a078d3d43\\\"\"" pod="service-telemetry/interconnect-operator-5bb49f789d-7d9xq" podUID="125693c0-b095-4b7e-9ce3-b96785d4198e" Feb 24 00:18:43 crc kubenswrapper[4824]: W0224 00:18:43.351475 4824 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f370348_c40e_4096_98c1_d681f34b8659.slice/crio-88a05b1c998aacc24d685d63410da86a25088b3385f47b39deddbcb95a3277d8 WatchSource:0}: Error finding container 88a05b1c998aacc24d685d63410da86a25088b3385f47b39deddbcb95a3277d8: Status 404 returned error can't find the container with id 88a05b1c998aacc24d685d63410da86a25088b3385f47b39deddbcb95a3277d8 Feb 24 00:18:43 crc kubenswrapper[4824]: I0224 00:18:43.354889 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-9mpql"] Feb 24 00:18:44 crc kubenswrapper[4824]: I0224 00:18:44.059381 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"96f9c835-f7c9-4774-9b95-8911ab4ffb23","Type":"ContainerStarted","Data":"98348fc60c95f3b72a13eee8026e154755adf6da14a38680e033e1765f7ba472"} Feb 24 00:18:44 crc kubenswrapper[4824]: I0224 00:18:44.063242 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-9mpql" event={"ID":"1f370348-c40e-4096-98c1-d681f34b8659","Type":"ContainerStarted","Data":"88a05b1c998aacc24d685d63410da86a25088b3385f47b39deddbcb95a3277d8"} Feb 24 00:18:44 crc kubenswrapper[4824]: I0224 00:18:44.258033 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Feb 24 00:18:44 crc kubenswrapper[4824]: I0224 00:18:44.313290 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Feb 24 00:18:45 crc kubenswrapper[4824]: I0224 00:18:45.072904 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-9mpql" event={"ID":"1f370348-c40e-4096-98c1-d681f34b8659","Type":"ContainerStarted","Data":"33970f108535c8aa9f059b01e3c63ac325765820364921cdfcaf114319016f24"} Feb 24 00:18:45 crc kubenswrapper[4824]: I0224 00:18:45.076785 4824 generic.go:334] "Generic (PLEG): container finished" podID="96f9c835-f7c9-4774-9b95-8911ab4ffb23" containerID="98348fc60c95f3b72a13eee8026e154755adf6da14a38680e033e1765f7ba472" exitCode=0 Feb 24 00:18:45 crc kubenswrapper[4824]: I0224 00:18:45.076883 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"96f9c835-f7c9-4774-9b95-8911ab4ffb23","Type":"ContainerDied","Data":"98348fc60c95f3b72a13eee8026e154755adf6da14a38680e033e1765f7ba472"} Feb 24 00:18:45 crc kubenswrapper[4824]: I0224 00:18:45.099629 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-9mpql" podStartSLOduration=12.968534443 
podStartE2EDuration="14.099602862s" podCreationTimestamp="2026-02-24 00:18:31 +0000 UTC" firstStartedPulling="2026-02-24 00:18:43.354566619 +0000 UTC m=+787.344191088" lastFinishedPulling="2026-02-24 00:18:44.485635038 +0000 UTC m=+788.475259507" observedRunningTime="2026-02-24 00:18:45.093260163 +0000 UTC m=+789.082884652" watchObservedRunningTime="2026-02-24 00:18:45.099602862 +0000 UTC m=+789.089227331" Feb 24 00:18:46 crc kubenswrapper[4824]: I0224 00:18:46.085295 4824 generic.go:334] "Generic (PLEG): container finished" podID="96f9c835-f7c9-4774-9b95-8911ab4ffb23" containerID="42e1a62a5519ed4aad2873a43d577e42a2ffc35c648851add4f4846bd1dfb329" exitCode=0 Feb 24 00:18:46 crc kubenswrapper[4824]: I0224 00:18:46.087732 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"96f9c835-f7c9-4774-9b95-8911ab4ffb23","Type":"ContainerDied","Data":"42e1a62a5519ed4aad2873a43d577e42a2ffc35c648851add4f4846bd1dfb329"} Feb 24 00:18:46 crc kubenswrapper[4824]: I0224 00:18:46.813904 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Feb 24 00:18:46 crc kubenswrapper[4824]: I0224 00:18:46.815250 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Feb 24 00:18:46 crc kubenswrapper[4824]: I0224 00:18:46.818294 4824 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-gqxxs" Feb 24 00:18:46 crc kubenswrapper[4824]: I0224 00:18:46.818412 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-ca" Feb 24 00:18:46 crc kubenswrapper[4824]: I0224 00:18:46.819962 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-sys-config" Feb 24 00:18:46 crc kubenswrapper[4824]: I0224 00:18:46.827259 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-global-ca" Feb 24 00:18:46 crc kubenswrapper[4824]: I0224 00:18:46.829979 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Feb 24 00:18:46 crc kubenswrapper[4824]: I0224 00:18:46.994785 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d647\" (UniqueName: \"kubernetes.io/projected/b1b50786-48f9-4f1a-bf8b-4686f9baae85-kube-api-access-8d647\") pod \"service-telemetry-operator-1-build\" (UID: \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 24 00:18:46 crc kubenswrapper[4824]: I0224 00:18:46.994857 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b1b50786-48f9-4f1a-bf8b-4686f9baae85-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 24 00:18:46 crc kubenswrapper[4824]: I0224 00:18:46.994883 4824 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b1b50786-48f9-4f1a-bf8b-4686f9baae85-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 24 00:18:46 crc kubenswrapper[4824]: I0224 00:18:46.995040 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/b1b50786-48f9-4f1a-bf8b-4686f9baae85-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 24 00:18:46 crc kubenswrapper[4824]: I0224 00:18:46.995140 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/b1b50786-48f9-4f1a-bf8b-4686f9baae85-builder-dockercfg-gqxxs-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 24 00:18:46 crc kubenswrapper[4824]: I0224 00:18:46.995213 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/b1b50786-48f9-4f1a-bf8b-4686f9baae85-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 24 00:18:46 crc kubenswrapper[4824]: I0224 00:18:46.995265 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/b1b50786-48f9-4f1a-bf8b-4686f9baae85-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: 
\"b1b50786-48f9-4f1a-bf8b-4686f9baae85\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 24 00:18:46 crc kubenswrapper[4824]: I0224 00:18:46.995306 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/b1b50786-48f9-4f1a-bf8b-4686f9baae85-builder-dockercfg-gqxxs-push\") pod \"service-telemetry-operator-1-build\" (UID: \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 24 00:18:46 crc kubenswrapper[4824]: I0224 00:18:46.995355 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b1b50786-48f9-4f1a-bf8b-4686f9baae85-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 24 00:18:46 crc kubenswrapper[4824]: I0224 00:18:46.995470 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/b1b50786-48f9-4f1a-bf8b-4686f9baae85-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 24 00:18:46 crc kubenswrapper[4824]: I0224 00:18:46.995508 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/b1b50786-48f9-4f1a-bf8b-4686f9baae85-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 24 00:18:46 crc kubenswrapper[4824]: I0224 00:18:46.995691 4824 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/b1b50786-48f9-4f1a-bf8b-4686f9baae85-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 24 00:18:47 crc kubenswrapper[4824]: I0224 00:18:47.095746 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"96f9c835-f7c9-4774-9b95-8911ab4ffb23","Type":"ContainerStarted","Data":"8b87192a94b84755847233b1f93d248ab45a5f5eb0ed3c88fbac47f8bf0fdb77"} Feb 24 00:18:47 crc kubenswrapper[4824]: I0224 00:18:47.096037 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/elasticsearch-es-default-0" Feb 24 00:18:47 crc kubenswrapper[4824]: I0224 00:18:47.096488 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/b1b50786-48f9-4f1a-bf8b-4686f9baae85-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 24 00:18:47 crc kubenswrapper[4824]: I0224 00:18:47.096581 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/b1b50786-48f9-4f1a-bf8b-4686f9baae85-builder-dockercfg-gqxxs-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 24 00:18:47 crc kubenswrapper[4824]: I0224 00:18:47.096618 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/b1b50786-48f9-4f1a-bf8b-4686f9baae85-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: 
\"b1b50786-48f9-4f1a-bf8b-4686f9baae85\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 24 00:18:47 crc kubenswrapper[4824]: I0224 00:18:47.096643 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/b1b50786-48f9-4f1a-bf8b-4686f9baae85-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 24 00:18:47 crc kubenswrapper[4824]: I0224 00:18:47.096669 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/b1b50786-48f9-4f1a-bf8b-4686f9baae85-builder-dockercfg-gqxxs-push\") pod \"service-telemetry-operator-1-build\" (UID: \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 24 00:18:47 crc kubenswrapper[4824]: I0224 00:18:47.096693 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b1b50786-48f9-4f1a-bf8b-4686f9baae85-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 24 00:18:47 crc kubenswrapper[4824]: I0224 00:18:47.096753 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/b1b50786-48f9-4f1a-bf8b-4686f9baae85-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 24 00:18:47 crc kubenswrapper[4824]: I0224 00:18:47.096800 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: 
\"kubernetes.io/empty-dir/b1b50786-48f9-4f1a-bf8b-4686f9baae85-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 24 00:18:47 crc kubenswrapper[4824]: I0224 00:18:47.096832 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/b1b50786-48f9-4f1a-bf8b-4686f9baae85-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 24 00:18:47 crc kubenswrapper[4824]: I0224 00:18:47.096871 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8d647\" (UniqueName: \"kubernetes.io/projected/b1b50786-48f9-4f1a-bf8b-4686f9baae85-kube-api-access-8d647\") pod \"service-telemetry-operator-1-build\" (UID: \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 24 00:18:47 crc kubenswrapper[4824]: I0224 00:18:47.096900 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b1b50786-48f9-4f1a-bf8b-4686f9baae85-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 24 00:18:47 crc kubenswrapper[4824]: I0224 00:18:47.096920 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b1b50786-48f9-4f1a-bf8b-4686f9baae85-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 24 00:18:47 crc kubenswrapper[4824]: I0224 00:18:47.097884 4824 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b1b50786-48f9-4f1a-bf8b-4686f9baae85-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 24 00:18:47 crc kubenswrapper[4824]: I0224 00:18:47.098217 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/b1b50786-48f9-4f1a-bf8b-4686f9baae85-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 24 00:18:47 crc kubenswrapper[4824]: I0224 00:18:47.098452 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/b1b50786-48f9-4f1a-bf8b-4686f9baae85-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 24 00:18:47 crc kubenswrapper[4824]: I0224 00:18:47.098467 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b1b50786-48f9-4f1a-bf8b-4686f9baae85-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 24 00:18:47 crc kubenswrapper[4824]: I0224 00:18:47.098740 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/b1b50786-48f9-4f1a-bf8b-4686f9baae85-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 24 00:18:47 
crc kubenswrapper[4824]: I0224 00:18:47.098828 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b1b50786-48f9-4f1a-bf8b-4686f9baae85-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 24 00:18:47 crc kubenswrapper[4824]: I0224 00:18:47.098828 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/b1b50786-48f9-4f1a-bf8b-4686f9baae85-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 24 00:18:47 crc kubenswrapper[4824]: I0224 00:18:47.098905 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/b1b50786-48f9-4f1a-bf8b-4686f9baae85-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 24 00:18:47 crc kubenswrapper[4824]: I0224 00:18:47.099255 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/b1b50786-48f9-4f1a-bf8b-4686f9baae85-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 24 00:18:47 crc kubenswrapper[4824]: I0224 00:18:47.108037 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/b1b50786-48f9-4f1a-bf8b-4686f9baae85-builder-dockercfg-gqxxs-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\") " 
pod="service-telemetry/service-telemetry-operator-1-build" Feb 24 00:18:47 crc kubenswrapper[4824]: I0224 00:18:47.120925 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d647\" (UniqueName: \"kubernetes.io/projected/b1b50786-48f9-4f1a-bf8b-4686f9baae85-kube-api-access-8d647\") pod \"service-telemetry-operator-1-build\" (UID: \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 24 00:18:47 crc kubenswrapper[4824]: I0224 00:18:47.124061 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/b1b50786-48f9-4f1a-bf8b-4686f9baae85-builder-dockercfg-gqxxs-push\") pod \"service-telemetry-operator-1-build\" (UID: \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\") " pod="service-telemetry/service-telemetry-operator-1-build" Feb 24 00:18:47 crc kubenswrapper[4824]: I0224 00:18:47.133834 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Feb 24 00:18:47 crc kubenswrapper[4824]: I0224 00:18:47.140327 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/elasticsearch-es-default-0" podStartSLOduration=7.11710773 podStartE2EDuration="27.140305837s" podCreationTimestamp="2026-02-24 00:18:20 +0000 UTC" firstStartedPulling="2026-02-24 00:18:23.282062947 +0000 UTC m=+767.271687426" lastFinishedPulling="2026-02-24 00:18:43.305261034 +0000 UTC m=+787.294885533" observedRunningTime="2026-02-24 00:18:47.139839125 +0000 UTC m=+791.129463594" watchObservedRunningTime="2026-02-24 00:18:47.140305837 +0000 UTC m=+791.129930306" Feb 24 00:18:47 crc kubenswrapper[4824]: I0224 00:18:47.457039 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Feb 24 00:18:47 crc kubenswrapper[4824]: W0224 00:18:47.463096 4824 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1b50786_48f9_4f1a_bf8b_4686f9baae85.slice/crio-d08a403612b11c8bf91f5e3516b788a4c4942b41c820856155d377d7e8361b75 WatchSource:0}: Error finding container d08a403612b11c8bf91f5e3516b788a4c4942b41c820856155d377d7e8361b75: Status 404 returned error can't find the container with id d08a403612b11c8bf91f5e3516b788a4c4942b41c820856155d377d7e8361b75 Feb 24 00:18:48 crc kubenswrapper[4824]: I0224 00:18:48.104798 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"b1b50786-48f9-4f1a-bf8b-4686f9baae85","Type":"ContainerStarted","Data":"d08a403612b11c8bf91f5e3516b788a4c4942b41c820856155d377d7e8361b75"} Feb 24 00:18:53 crc kubenswrapper[4824]: I0224 00:18:53.275742 4824 patch_prober.go:28] interesting pod/machine-config-daemon-vcbgn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 00:18:53 crc kubenswrapper[4824]: I0224 00:18:53.276635 4824 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 00:18:54 crc kubenswrapper[4824]: I0224 00:18:54.168146 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/interconnect-operator-5bb49f789d-7d9xq" event={"ID":"125693c0-b095-4b7e-9ce3-b96785d4198e","Type":"ContainerStarted","Data":"6524e4c3b507642b50a96cffbe386804f5b8232b06ea6f927c77136198a8a496"} Feb 24 00:18:54 crc kubenswrapper[4824]: I0224 00:18:54.169371 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-qhzcr" event={"ID":"219daf0d-f400-4a2c-8374-5c23e10c27a6","Type":"ContainerStarted","Data":"871c5880f89e2ca8dce6b4a9fb09ca48dac5394160322047f990e07ba8448b18"} Feb 24 00:18:54 crc kubenswrapper[4824]: I0224 00:18:54.171821 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-m8rqb" event={"ID":"ad293038-bf1d-4800-bd32-9488c5f19e95","Type":"ContainerStarted","Data":"a7ff395be8da0def82507a88e46b01ea6cba5773d09ca6fe2c05f85b04310f1d"} Feb 24 00:18:54 crc kubenswrapper[4824]: I0224 00:18:54.172003 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-m8rqb" Feb 24 00:18:54 crc kubenswrapper[4824]: I0224 00:18:54.173773 4824 generic.go:334] "Generic (PLEG): container finished" podID="b1b50786-48f9-4f1a-bf8b-4686f9baae85" containerID="ab94cc45a4691d159d5d48b21f724973db63d5bd87b6d0a0e4494040c5bc84b1" exitCode=0 Feb 24 00:18:54 crc kubenswrapper[4824]: I0224 
00:18:54.173805 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"b1b50786-48f9-4f1a-bf8b-4686f9baae85","Type":"ContainerDied","Data":"ab94cc45a4691d159d5d48b21f724973db63d5bd87b6d0a0e4494040c5bc84b1"} Feb 24 00:18:54 crc kubenswrapper[4824]: I0224 00:18:54.257310 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/interconnect-operator-5bb49f789d-7d9xq" podStartSLOduration=5.370088689 podStartE2EDuration="52.257277208s" podCreationTimestamp="2026-02-24 00:18:02 +0000 UTC" firstStartedPulling="2026-02-24 00:18:05.930014383 +0000 UTC m=+749.919638852" lastFinishedPulling="2026-02-24 00:18:52.817202882 +0000 UTC m=+796.806827371" observedRunningTime="2026-02-24 00:18:54.196931385 +0000 UTC m=+798.186555874" watchObservedRunningTime="2026-02-24 00:18:54.257277208 +0000 UTC m=+798.246901677" Feb 24 00:18:54 crc kubenswrapper[4824]: I0224 00:18:54.274653 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-qhzcr" podStartSLOduration=-9223372006.580164 podStartE2EDuration="30.274612103s" podCreationTimestamp="2026-02-24 00:18:24 +0000 UTC" firstStartedPulling="2026-02-24 00:18:25.306249059 +0000 UTC m=+769.295873528" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:18:54.272554052 +0000 UTC m=+798.262178541" watchObservedRunningTime="2026-02-24 00:18:54.274612103 +0000 UTC m=+798.264236572" Feb 24 00:18:54 crc kubenswrapper[4824]: I0224 00:18:54.304738 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-m8rqb" podStartSLOduration=-9223372003.550068 podStartE2EDuration="33.304708838s" podCreationTimestamp="2026-02-24 00:18:21 +0000 UTC" firstStartedPulling="2026-02-24 00:18:22.580681165 +0000 UTC m=+766.570305624" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-24 00:18:54.303778564 +0000 UTC m=+798.293403043" watchObservedRunningTime="2026-02-24 00:18:54.304708838 +0000 UTC m=+798.294333327" Feb 24 00:18:55 crc kubenswrapper[4824]: I0224 00:18:55.183563 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"b1b50786-48f9-4f1a-bf8b-4686f9baae85","Type":"ContainerStarted","Data":"8766c6c73cf28ccc576c83f82e20c6fe70a6664d9a7a04a13302eaacbca67f41"} Feb 24 00:18:55 crc kubenswrapper[4824]: I0224 00:18:55.215813 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-1-build" podStartSLOduration=3.863134537 podStartE2EDuration="9.215787982s" podCreationTimestamp="2026-02-24 00:18:46 +0000 UTC" firstStartedPulling="2026-02-24 00:18:47.492557529 +0000 UTC m=+791.482182008" lastFinishedPulling="2026-02-24 00:18:52.845210984 +0000 UTC m=+796.834835453" observedRunningTime="2026-02-24 00:18:55.210284274 +0000 UTC m=+799.199908753" watchObservedRunningTime="2026-02-24 00:18:55.215787982 +0000 UTC m=+799.205412451" Feb 24 00:18:57 crc kubenswrapper[4824]: I0224 00:18:57.233544 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Feb 24 00:18:57 crc kubenswrapper[4824]: I0224 00:18:57.234227 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/service-telemetry-operator-1-build" podUID="b1b50786-48f9-4f1a-bf8b-4686f9baae85" containerName="docker-build" containerID="cri-o://8766c6c73cf28ccc576c83f82e20c6fe70a6664d9a7a04a13302eaacbca67f41" gracePeriod=30 Feb 24 00:18:57 crc kubenswrapper[4824]: I0224 00:18:57.801560 4824 prober.go:107] "Probe failed" probeType="Readiness" pod="service-telemetry/elasticsearch-es-default-0" podUID="96f9c835-f7c9-4774-9b95-8911ab4ffb23" containerName="elasticsearch" probeResult="failure" output=< Feb 24 00:18:57 crc kubenswrapper[4824]: 
{"timestamp": "2026-02-24T00:18:57+00:00", "message": "readiness probe failed", "curl_rc": "7"} Feb 24 00:18:57 crc kubenswrapper[4824]: > Feb 24 00:18:59 crc kubenswrapper[4824]: I0224 00:18:59.131895 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Feb 24 00:18:59 crc kubenswrapper[4824]: I0224 00:18:59.133431 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Feb 24 00:18:59 crc kubenswrapper[4824]: I0224 00:18:59.138304 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-global-ca" Feb 24 00:18:59 crc kubenswrapper[4824]: I0224 00:18:59.138451 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-sys-config" Feb 24 00:18:59 crc kubenswrapper[4824]: I0224 00:18:59.138655 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-ca" Feb 24 00:18:59 crc kubenswrapper[4824]: I0224 00:18:59.155111 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Feb 24 00:18:59 crc kubenswrapper[4824]: I0224 00:18:59.222457 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/cc20437c-c977-4543-a681-cda1af5c3583-builder-dockercfg-gqxxs-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"cc20437c-c977-4543-a681-cda1af5c3583\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 24 00:18:59 crc kubenswrapper[4824]: I0224 00:18:59.222573 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: 
\"kubernetes.io/empty-dir/cc20437c-c977-4543-a681-cda1af5c3583-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"cc20437c-c977-4543-a681-cda1af5c3583\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 24 00:18:59 crc kubenswrapper[4824]: I0224 00:18:59.222777 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/cc20437c-c977-4543-a681-cda1af5c3583-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"cc20437c-c977-4543-a681-cda1af5c3583\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 24 00:18:59 crc kubenswrapper[4824]: I0224 00:18:59.222888 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cc20437c-c977-4543-a681-cda1af5c3583-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"cc20437c-c977-4543-a681-cda1af5c3583\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 24 00:18:59 crc kubenswrapper[4824]: I0224 00:18:59.222945 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/cc20437c-c977-4543-a681-cda1af5c3583-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"cc20437c-c977-4543-a681-cda1af5c3583\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 24 00:18:59 crc kubenswrapper[4824]: I0224 00:18:59.223084 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cc20437c-c977-4543-a681-cda1af5c3583-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"cc20437c-c977-4543-a681-cda1af5c3583\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 24 00:18:59 
crc kubenswrapper[4824]: I0224 00:18:59.223168 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/cc20437c-c977-4543-a681-cda1af5c3583-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"cc20437c-c977-4543-a681-cda1af5c3583\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 24 00:18:59 crc kubenswrapper[4824]: I0224 00:18:59.223242 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2r7jd\" (UniqueName: \"kubernetes.io/projected/cc20437c-c977-4543-a681-cda1af5c3583-kube-api-access-2r7jd\") pod \"service-telemetry-operator-2-build\" (UID: \"cc20437c-c977-4543-a681-cda1af5c3583\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 24 00:18:59 crc kubenswrapper[4824]: I0224 00:18:59.223305 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/cc20437c-c977-4543-a681-cda1af5c3583-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"cc20437c-c977-4543-a681-cda1af5c3583\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 24 00:18:59 crc kubenswrapper[4824]: I0224 00:18:59.223387 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/cc20437c-c977-4543-a681-cda1af5c3583-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"cc20437c-c977-4543-a681-cda1af5c3583\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 24 00:18:59 crc kubenswrapper[4824]: I0224 00:18:59.223433 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: 
\"kubernetes.io/secret/cc20437c-c977-4543-a681-cda1af5c3583-builder-dockercfg-gqxxs-push\") pod \"service-telemetry-operator-2-build\" (UID: \"cc20437c-c977-4543-a681-cda1af5c3583\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 24 00:18:59 crc kubenswrapper[4824]: I0224 00:18:59.223462 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/cc20437c-c977-4543-a681-cda1af5c3583-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"cc20437c-c977-4543-a681-cda1af5c3583\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 24 00:18:59 crc kubenswrapper[4824]: I0224 00:18:59.325570 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/cc20437c-c977-4543-a681-cda1af5c3583-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"cc20437c-c977-4543-a681-cda1af5c3583\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 24 00:18:59 crc kubenswrapper[4824]: I0224 00:18:59.326020 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cc20437c-c977-4543-a681-cda1af5c3583-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"cc20437c-c977-4543-a681-cda1af5c3583\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 24 00:18:59 crc kubenswrapper[4824]: I0224 00:18:59.326130 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/cc20437c-c977-4543-a681-cda1af5c3583-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"cc20437c-c977-4543-a681-cda1af5c3583\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 24 00:18:59 crc kubenswrapper[4824]: I0224 00:18:59.326151 4824 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/cc20437c-c977-4543-a681-cda1af5c3583-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"cc20437c-c977-4543-a681-cda1af5c3583\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 24 00:18:59 crc kubenswrapper[4824]: I0224 00:18:59.326324 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cc20437c-c977-4543-a681-cda1af5c3583-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"cc20437c-c977-4543-a681-cda1af5c3583\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 24 00:18:59 crc kubenswrapper[4824]: I0224 00:18:59.326416 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/cc20437c-c977-4543-a681-cda1af5c3583-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"cc20437c-c977-4543-a681-cda1af5c3583\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 24 00:18:59 crc kubenswrapper[4824]: I0224 00:18:59.326501 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2r7jd\" (UniqueName: \"kubernetes.io/projected/cc20437c-c977-4543-a681-cda1af5c3583-kube-api-access-2r7jd\") pod \"service-telemetry-operator-2-build\" (UID: \"cc20437c-c977-4543-a681-cda1af5c3583\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 24 00:18:59 crc kubenswrapper[4824]: I0224 00:18:59.326635 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/cc20437c-c977-4543-a681-cda1af5c3583-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"cc20437c-c977-4543-a681-cda1af5c3583\") " 
pod="service-telemetry/service-telemetry-operator-2-build" Feb 24 00:18:59 crc kubenswrapper[4824]: I0224 00:18:59.326726 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/cc20437c-c977-4543-a681-cda1af5c3583-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"cc20437c-c977-4543-a681-cda1af5c3583\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 24 00:18:59 crc kubenswrapper[4824]: I0224 00:18:59.326804 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/cc20437c-c977-4543-a681-cda1af5c3583-builder-dockercfg-gqxxs-push\") pod \"service-telemetry-operator-2-build\" (UID: \"cc20437c-c977-4543-a681-cda1af5c3583\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 24 00:18:59 crc kubenswrapper[4824]: I0224 00:18:59.326884 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/cc20437c-c977-4543-a681-cda1af5c3583-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"cc20437c-c977-4543-a681-cda1af5c3583\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 24 00:18:59 crc kubenswrapper[4824]: I0224 00:18:59.326969 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/cc20437c-c977-4543-a681-cda1af5c3583-builder-dockercfg-gqxxs-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"cc20437c-c977-4543-a681-cda1af5c3583\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 24 00:18:59 crc kubenswrapper[4824]: I0224 00:18:59.327054 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: 
\"kubernetes.io/empty-dir/cc20437c-c977-4543-a681-cda1af5c3583-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"cc20437c-c977-4543-a681-cda1af5c3583\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 24 00:18:59 crc kubenswrapper[4824]: I0224 00:18:59.326886 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/cc20437c-c977-4543-a681-cda1af5c3583-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"cc20437c-c977-4543-a681-cda1af5c3583\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 24 00:18:59 crc kubenswrapper[4824]: I0224 00:18:59.326792 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/cc20437c-c977-4543-a681-cda1af5c3583-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"cc20437c-c977-4543-a681-cda1af5c3583\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 24 00:18:59 crc kubenswrapper[4824]: I0224 00:18:59.326804 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/cc20437c-c977-4543-a681-cda1af5c3583-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"cc20437c-c977-4543-a681-cda1af5c3583\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 24 00:18:59 crc kubenswrapper[4824]: I0224 00:18:59.327264 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cc20437c-c977-4543-a681-cda1af5c3583-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"cc20437c-c977-4543-a681-cda1af5c3583\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 24 00:18:59 crc kubenswrapper[4824]: I0224 00:18:59.327288 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/cc20437c-c977-4543-a681-cda1af5c3583-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"cc20437c-c977-4543-a681-cda1af5c3583\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 24 00:18:59 crc kubenswrapper[4824]: I0224 00:18:59.327305 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/cc20437c-c977-4543-a681-cda1af5c3583-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"cc20437c-c977-4543-a681-cda1af5c3583\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 24 00:18:59 crc kubenswrapper[4824]: I0224 00:18:59.327825 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/cc20437c-c977-4543-a681-cda1af5c3583-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"cc20437c-c977-4543-a681-cda1af5c3583\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 24 00:18:59 crc kubenswrapper[4824]: I0224 00:18:59.328018 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cc20437c-c977-4543-a681-cda1af5c3583-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"cc20437c-c977-4543-a681-cda1af5c3583\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 24 00:18:59 crc kubenswrapper[4824]: I0224 00:18:59.346101 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/cc20437c-c977-4543-a681-cda1af5c3583-builder-dockercfg-gqxxs-push\") pod \"service-telemetry-operator-2-build\" (UID: \"cc20437c-c977-4543-a681-cda1af5c3583\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 24 00:18:59 crc kubenswrapper[4824]: I0224 00:18:59.346105 4824 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/cc20437c-c977-4543-a681-cda1af5c3583-builder-dockercfg-gqxxs-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"cc20437c-c977-4543-a681-cda1af5c3583\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 24 00:18:59 crc kubenswrapper[4824]: I0224 00:18:59.350865 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2r7jd\" (UniqueName: \"kubernetes.io/projected/cc20437c-c977-4543-a681-cda1af5c3583-kube-api-access-2r7jd\") pod \"service-telemetry-operator-2-build\" (UID: \"cc20437c-c977-4543-a681-cda1af5c3583\") " pod="service-telemetry/service-telemetry-operator-2-build" Feb 24 00:18:59 crc kubenswrapper[4824]: I0224 00:18:59.456752 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Feb 24 00:19:00 crc kubenswrapper[4824]: I0224 00:19:00.637981 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Feb 24 00:19:00 crc kubenswrapper[4824]: W0224 00:19:00.651640 4824 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc20437c_c977_4543_a681_cda1af5c3583.slice/crio-fd9408fdeee02b30a3fa42083cf9391e2cb143354c82385b1e3d293e48b613e4 WatchSource:0}: Error finding container fd9408fdeee02b30a3fa42083cf9391e2cb143354c82385b1e3d293e48b613e4: Status 404 returned error can't find the container with id fd9408fdeee02b30a3fa42083cf9391e2cb143354c82385b1e3d293e48b613e4 Feb 24 00:19:01 crc kubenswrapper[4824]: I0224 00:19:01.232663 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" 
event={"ID":"cc20437c-c977-4543-a681-cda1af5c3583","Type":"ContainerStarted","Data":"fd9408fdeee02b30a3fa42083cf9391e2cb143354c82385b1e3d293e48b613e4"} Feb 24 00:19:02 crc kubenswrapper[4824]: I0224 00:19:02.017536 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-m8rqb" Feb 24 00:19:02 crc kubenswrapper[4824]: I0224 00:19:02.771938 4824 prober.go:107] "Probe failed" probeType="Readiness" pod="service-telemetry/elasticsearch-es-default-0" podUID="96f9c835-f7c9-4774-9b95-8911ab4ffb23" containerName="elasticsearch" probeResult="failure" output=< Feb 24 00:19:02 crc kubenswrapper[4824]: {"timestamp": "2026-02-24T00:19:02+00:00", "message": "readiness probe failed", "curl_rc": "7"} Feb 24 00:19:02 crc kubenswrapper[4824]: > Feb 24 00:19:03 crc kubenswrapper[4824]: I0224 00:19:03.948246 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-1-build_b1b50786-48f9-4f1a-bf8b-4686f9baae85/docker-build/0.log" Feb 24 00:19:03 crc kubenswrapper[4824]: I0224 00:19:03.949642 4824 generic.go:334] "Generic (PLEG): container finished" podID="b1b50786-48f9-4f1a-bf8b-4686f9baae85" containerID="8766c6c73cf28ccc576c83f82e20c6fe70a6664d9a7a04a13302eaacbca67f41" exitCode=1 Feb 24 00:19:03 crc kubenswrapper[4824]: I0224 00:19:03.949719 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"b1b50786-48f9-4f1a-bf8b-4686f9baae85","Type":"ContainerDied","Data":"8766c6c73cf28ccc576c83f82e20c6fe70a6664d9a7a04a13302eaacbca67f41"} Feb 24 00:19:04 crc kubenswrapper[4824]: I0224 00:19:04.084246 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-1-build_b1b50786-48f9-4f1a-bf8b-4686f9baae85/docker-build/0.log" Feb 24 00:19:04 crc kubenswrapper[4824]: I0224 00:19:04.084942 4824 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Feb 24 00:19:04 crc kubenswrapper[4824]: I0224 00:19:04.107661 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/b1b50786-48f9-4f1a-bf8b-4686f9baae85-container-storage-root\") pod \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\" (UID: \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\") " Feb 24 00:19:04 crc kubenswrapper[4824]: I0224 00:19:04.107718 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/b1b50786-48f9-4f1a-bf8b-4686f9baae85-container-storage-run\") pod \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\" (UID: \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\") " Feb 24 00:19:04 crc kubenswrapper[4824]: I0224 00:19:04.107795 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/b1b50786-48f9-4f1a-bf8b-4686f9baae85-builder-dockercfg-gqxxs-pull\") pod \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\" (UID: \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\") " Feb 24 00:19:04 crc kubenswrapper[4824]: I0224 00:19:04.107829 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b1b50786-48f9-4f1a-bf8b-4686f9baae85-build-ca-bundles\") pod \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\" (UID: \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\") " Feb 24 00:19:04 crc kubenswrapper[4824]: I0224 00:19:04.107876 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8d647\" (UniqueName: \"kubernetes.io/projected/b1b50786-48f9-4f1a-bf8b-4686f9baae85-kube-api-access-8d647\") pod \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\" (UID: \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\") " Feb 24 00:19:04 crc kubenswrapper[4824]: I0224 00:19:04.107927 4824 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/b1b50786-48f9-4f1a-bf8b-4686f9baae85-buildcachedir\") pod \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\" (UID: \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\") " Feb 24 00:19:04 crc kubenswrapper[4824]: I0224 00:19:04.108046 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/b1b50786-48f9-4f1a-bf8b-4686f9baae85-build-system-configs\") pod \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\" (UID: \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\") " Feb 24 00:19:04 crc kubenswrapper[4824]: I0224 00:19:04.108108 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/b1b50786-48f9-4f1a-bf8b-4686f9baae85-builder-dockercfg-gqxxs-push\") pod \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\" (UID: \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\") " Feb 24 00:19:04 crc kubenswrapper[4824]: I0224 00:19:04.108155 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b1b50786-48f9-4f1a-bf8b-4686f9baae85-node-pullsecrets\") pod \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\" (UID: \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\") " Feb 24 00:19:04 crc kubenswrapper[4824]: I0224 00:19:04.108191 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/b1b50786-48f9-4f1a-bf8b-4686f9baae85-buildworkdir\") pod \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\" (UID: \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\") " Feb 24 00:19:04 crc kubenswrapper[4824]: I0224 00:19:04.108264 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/b1b50786-48f9-4f1a-bf8b-4686f9baae85-build-proxy-ca-bundles\") pod \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\" (UID: \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\") " Feb 24 00:19:04 crc kubenswrapper[4824]: I0224 00:19:04.108307 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/b1b50786-48f9-4f1a-bf8b-4686f9baae85-build-blob-cache\") pod \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\" (UID: \"b1b50786-48f9-4f1a-bf8b-4686f9baae85\") " Feb 24 00:19:04 crc kubenswrapper[4824]: I0224 00:19:04.109617 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1b50786-48f9-4f1a-bf8b-4686f9baae85-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "b1b50786-48f9-4f1a-bf8b-4686f9baae85" (UID: "b1b50786-48f9-4f1a-bf8b-4686f9baae85"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:19:04 crc kubenswrapper[4824]: I0224 00:19:04.109687 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b1b50786-48f9-4f1a-bf8b-4686f9baae85-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "b1b50786-48f9-4f1a-bf8b-4686f9baae85" (UID: "b1b50786-48f9-4f1a-bf8b-4686f9baae85"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 00:19:04 crc kubenswrapper[4824]: I0224 00:19:04.109820 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b1b50786-48f9-4f1a-bf8b-4686f9baae85-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "b1b50786-48f9-4f1a-bf8b-4686f9baae85" (UID: "b1b50786-48f9-4f1a-bf8b-4686f9baae85"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 00:19:04 crc kubenswrapper[4824]: I0224 00:19:04.110759 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1b50786-48f9-4f1a-bf8b-4686f9baae85-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "b1b50786-48f9-4f1a-bf8b-4686f9baae85" (UID: "b1b50786-48f9-4f1a-bf8b-4686f9baae85"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:19:04 crc kubenswrapper[4824]: I0224 00:19:04.111136 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1b50786-48f9-4f1a-bf8b-4686f9baae85-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "b1b50786-48f9-4f1a-bf8b-4686f9baae85" (UID: "b1b50786-48f9-4f1a-bf8b-4686f9baae85"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:19:04 crc kubenswrapper[4824]: I0224 00:19:04.111159 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1b50786-48f9-4f1a-bf8b-4686f9baae85-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "b1b50786-48f9-4f1a-bf8b-4686f9baae85" (UID: "b1b50786-48f9-4f1a-bf8b-4686f9baae85"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:19:04 crc kubenswrapper[4824]: I0224 00:19:04.112088 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1b50786-48f9-4f1a-bf8b-4686f9baae85-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "b1b50786-48f9-4f1a-bf8b-4686f9baae85" (UID: "b1b50786-48f9-4f1a-bf8b-4686f9baae85"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:19:04 crc kubenswrapper[4824]: I0224 00:19:04.112368 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1b50786-48f9-4f1a-bf8b-4686f9baae85-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "b1b50786-48f9-4f1a-bf8b-4686f9baae85" (UID: "b1b50786-48f9-4f1a-bf8b-4686f9baae85"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:19:04 crc kubenswrapper[4824]: I0224 00:19:04.112379 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1b50786-48f9-4f1a-bf8b-4686f9baae85-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "b1b50786-48f9-4f1a-bf8b-4686f9baae85" (UID: "b1b50786-48f9-4f1a-bf8b-4686f9baae85"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:19:04 crc kubenswrapper[4824]: I0224 00:19:04.118757 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1b50786-48f9-4f1a-bf8b-4686f9baae85-builder-dockercfg-gqxxs-push" (OuterVolumeSpecName: "builder-dockercfg-gqxxs-push") pod "b1b50786-48f9-4f1a-bf8b-4686f9baae85" (UID: "b1b50786-48f9-4f1a-bf8b-4686f9baae85"). InnerVolumeSpecName "builder-dockercfg-gqxxs-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:19:04 crc kubenswrapper[4824]: I0224 00:19:04.119760 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1b50786-48f9-4f1a-bf8b-4686f9baae85-builder-dockercfg-gqxxs-pull" (OuterVolumeSpecName: "builder-dockercfg-gqxxs-pull") pod "b1b50786-48f9-4f1a-bf8b-4686f9baae85" (UID: "b1b50786-48f9-4f1a-bf8b-4686f9baae85"). InnerVolumeSpecName "builder-dockercfg-gqxxs-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:19:04 crc kubenswrapper[4824]: I0224 00:19:04.122745 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1b50786-48f9-4f1a-bf8b-4686f9baae85-kube-api-access-8d647" (OuterVolumeSpecName: "kube-api-access-8d647") pod "b1b50786-48f9-4f1a-bf8b-4686f9baae85" (UID: "b1b50786-48f9-4f1a-bf8b-4686f9baae85"). InnerVolumeSpecName "kube-api-access-8d647". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:19:04 crc kubenswrapper[4824]: I0224 00:19:04.209970 4824 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/b1b50786-48f9-4f1a-bf8b-4686f9baae85-container-storage-run\") on node \"crc\" DevicePath \"\"" Feb 24 00:19:04 crc kubenswrapper[4824]: I0224 00:19:04.210022 4824 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/b1b50786-48f9-4f1a-bf8b-4686f9baae85-container-storage-root\") on node \"crc\" DevicePath \"\"" Feb 24 00:19:04 crc kubenswrapper[4824]: I0224 00:19:04.210040 4824 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/b1b50786-48f9-4f1a-bf8b-4686f9baae85-builder-dockercfg-gqxxs-pull\") on node \"crc\" DevicePath \"\"" Feb 24 00:19:04 crc kubenswrapper[4824]: I0224 00:19:04.210057 4824 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b1b50786-48f9-4f1a-bf8b-4686f9baae85-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 24 00:19:04 crc kubenswrapper[4824]: I0224 00:19:04.210070 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8d647\" (UniqueName: \"kubernetes.io/projected/b1b50786-48f9-4f1a-bf8b-4686f9baae85-kube-api-access-8d647\") on node \"crc\" DevicePath \"\"" Feb 24 00:19:04 crc kubenswrapper[4824]: I0224 00:19:04.210088 4824 
reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/b1b50786-48f9-4f1a-bf8b-4686f9baae85-buildcachedir\") on node \"crc\" DevicePath \"\"" Feb 24 00:19:04 crc kubenswrapper[4824]: I0224 00:19:04.210102 4824 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/b1b50786-48f9-4f1a-bf8b-4686f9baae85-build-system-configs\") on node \"crc\" DevicePath \"\"" Feb 24 00:19:04 crc kubenswrapper[4824]: I0224 00:19:04.210115 4824 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/b1b50786-48f9-4f1a-bf8b-4686f9baae85-builder-dockercfg-gqxxs-push\") on node \"crc\" DevicePath \"\"" Feb 24 00:19:04 crc kubenswrapper[4824]: I0224 00:19:04.210128 4824 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b1b50786-48f9-4f1a-bf8b-4686f9baae85-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Feb 24 00:19:04 crc kubenswrapper[4824]: I0224 00:19:04.210143 4824 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/b1b50786-48f9-4f1a-bf8b-4686f9baae85-buildworkdir\") on node \"crc\" DevicePath \"\"" Feb 24 00:19:04 crc kubenswrapper[4824]: I0224 00:19:04.210157 4824 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b1b50786-48f9-4f1a-bf8b-4686f9baae85-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 24 00:19:04 crc kubenswrapper[4824]: I0224 00:19:04.210213 4824 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/b1b50786-48f9-4f1a-bf8b-4686f9baae85-build-blob-cache\") on node \"crc\" DevicePath \"\"" Feb 24 00:19:04 crc kubenswrapper[4824]: I0224 00:19:04.958824 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"cc20437c-c977-4543-a681-cda1af5c3583","Type":"ContainerStarted","Data":"57263a4947e6ca13016d819f5f6d5967c292ce21c193f314e05a87993146fc70"} Feb 24 00:19:04 crc kubenswrapper[4824]: I0224 00:19:04.961561 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-1-build_b1b50786-48f9-4f1a-bf8b-4686f9baae85/docker-build/0.log" Feb 24 00:19:04 crc kubenswrapper[4824]: I0224 00:19:04.962001 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"b1b50786-48f9-4f1a-bf8b-4686f9baae85","Type":"ContainerDied","Data":"d08a403612b11c8bf91f5e3516b788a4c4942b41c820856155d377d7e8361b75"} Feb 24 00:19:04 crc kubenswrapper[4824]: I0224 00:19:04.962059 4824 scope.go:117] "RemoveContainer" containerID="8766c6c73cf28ccc576c83f82e20c6fe70a6664d9a7a04a13302eaacbca67f41" Feb 24 00:19:04 crc kubenswrapper[4824]: I0224 00:19:04.962210 4824 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Feb 24 00:19:04 crc kubenswrapper[4824]: I0224 00:19:04.986833 4824 scope.go:117] "RemoveContainer" containerID="ab94cc45a4691d159d5d48b21f724973db63d5bd87b6d0a0e4494040c5bc84b1" Feb 24 00:19:05 crc kubenswrapper[4824]: I0224 00:19:05.025136 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Feb 24 00:19:05 crc kubenswrapper[4824]: E0224 00:19:05.046961 4824 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=4548264831213021485, SKID=, AKID=10:68:53:25:D1:3D:2E:E3:54:0D:95:0B:6A:90:F1:BD:A9:4F:27:6E failed: x509: certificate signed by unknown authority" Feb 24 00:19:05 crc kubenswrapper[4824]: I0224 00:19:05.048042 4824 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Feb 24 00:19:06 crc kubenswrapper[4824]: I0224 00:19:06.084918 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Feb 24 00:19:06 crc kubenswrapper[4824]: I0224 00:19:06.700910 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1b50786-48f9-4f1a-bf8b-4686f9baae85" path="/var/lib/kubelet/pods/b1b50786-48f9-4f1a-bf8b-4686f9baae85/volumes" Feb 24 00:19:06 crc kubenswrapper[4824]: I0224 00:19:06.975479 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/service-telemetry-operator-2-build" podUID="cc20437c-c977-4543-a681-cda1af5c3583" containerName="git-clone" containerID="cri-o://57263a4947e6ca13016d819f5f6d5967c292ce21c193f314e05a87993146fc70" gracePeriod=30 Feb 24 00:19:07 crc kubenswrapper[4824]: I0224 00:19:07.984705 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-2-build_cc20437c-c977-4543-a681-cda1af5c3583/git-clone/0.log" Feb 24 00:19:07 crc 
kubenswrapper[4824]: I0224 00:19:07.986417 4824 generic.go:334] "Generic (PLEG): container finished" podID="cc20437c-c977-4543-a681-cda1af5c3583" containerID="57263a4947e6ca13016d819f5f6d5967c292ce21c193f314e05a87993146fc70" exitCode=1 Feb 24 00:19:07 crc kubenswrapper[4824]: I0224 00:19:07.986508 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"cc20437c-c977-4543-a681-cda1af5c3583","Type":"ContainerDied","Data":"57263a4947e6ca13016d819f5f6d5967c292ce21c193f314e05a87993146fc70"} Feb 24 00:19:08 crc kubenswrapper[4824]: I0224 00:19:08.211280 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/elasticsearch-es-default-0" Feb 24 00:19:08 crc kubenswrapper[4824]: I0224 00:19:08.567004 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-2-build_cc20437c-c977-4543-a681-cda1af5c3583/git-clone/0.log" Feb 24 00:19:08 crc kubenswrapper[4824]: I0224 00:19:08.567126 4824 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build"
Feb 24 00:19:08 crc kubenswrapper[4824]: I0224 00:19:08.590910 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cc20437c-c977-4543-a681-cda1af5c3583-build-proxy-ca-bundles\") pod \"cc20437c-c977-4543-a681-cda1af5c3583\" (UID: \"cc20437c-c977-4543-a681-cda1af5c3583\") "
Feb 24 00:19:08 crc kubenswrapper[4824]: I0224 00:19:08.590991 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/cc20437c-c977-4543-a681-cda1af5c3583-builder-dockercfg-gqxxs-pull\") pod \"cc20437c-c977-4543-a681-cda1af5c3583\" (UID: \"cc20437c-c977-4543-a681-cda1af5c3583\") "
Feb 24 00:19:08 crc kubenswrapper[4824]: I0224 00:19:08.591079 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/cc20437c-c977-4543-a681-cda1af5c3583-build-system-configs\") pod \"cc20437c-c977-4543-a681-cda1af5c3583\" (UID: \"cc20437c-c977-4543-a681-cda1af5c3583\") "
Feb 24 00:19:08 crc kubenswrapper[4824]: I0224 00:19:08.591117 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/cc20437c-c977-4543-a681-cda1af5c3583-node-pullsecrets\") pod \"cc20437c-c977-4543-a681-cda1af5c3583\" (UID: \"cc20437c-c977-4543-a681-cda1af5c3583\") "
Feb 24 00:19:08 crc kubenswrapper[4824]: I0224 00:19:08.591150 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/cc20437c-c977-4543-a681-cda1af5c3583-container-storage-run\") pod \"cc20437c-c977-4543-a681-cda1af5c3583\" (UID: \"cc20437c-c977-4543-a681-cda1af5c3583\") "
Feb 24 00:19:08 crc kubenswrapper[4824]: I0224 00:19:08.591183 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2r7jd\" (UniqueName: \"kubernetes.io/projected/cc20437c-c977-4543-a681-cda1af5c3583-kube-api-access-2r7jd\") pod \"cc20437c-c977-4543-a681-cda1af5c3583\" (UID: \"cc20437c-c977-4543-a681-cda1af5c3583\") "
Feb 24 00:19:08 crc kubenswrapper[4824]: I0224 00:19:08.591204 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/cc20437c-c977-4543-a681-cda1af5c3583-builder-dockercfg-gqxxs-push\") pod \"cc20437c-c977-4543-a681-cda1af5c3583\" (UID: \"cc20437c-c977-4543-a681-cda1af5c3583\") "
Feb 24 00:19:08 crc kubenswrapper[4824]: I0224 00:19:08.591236 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/cc20437c-c977-4543-a681-cda1af5c3583-buildcachedir\") pod \"cc20437c-c977-4543-a681-cda1af5c3583\" (UID: \"cc20437c-c977-4543-a681-cda1af5c3583\") "
Feb 24 00:19:08 crc kubenswrapper[4824]: I0224 00:19:08.591284 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/cc20437c-c977-4543-a681-cda1af5c3583-container-storage-root\") pod \"cc20437c-c977-4543-a681-cda1af5c3583\" (UID: \"cc20437c-c977-4543-a681-cda1af5c3583\") "
Feb 24 00:19:08 crc kubenswrapper[4824]: I0224 00:19:08.591302 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/cc20437c-c977-4543-a681-cda1af5c3583-buildworkdir\") pod \"cc20437c-c977-4543-a681-cda1af5c3583\" (UID: \"cc20437c-c977-4543-a681-cda1af5c3583\") "
Feb 24 00:19:08 crc kubenswrapper[4824]: I0224 00:19:08.591354 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cc20437c-c977-4543-a681-cda1af5c3583-build-ca-bundles\") pod \"cc20437c-c977-4543-a681-cda1af5c3583\" (UID: \"cc20437c-c977-4543-a681-cda1af5c3583\") "
Feb 24 00:19:08 crc kubenswrapper[4824]: I0224 00:19:08.591400 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/cc20437c-c977-4543-a681-cda1af5c3583-build-blob-cache\") pod \"cc20437c-c977-4543-a681-cda1af5c3583\" (UID: \"cc20437c-c977-4543-a681-cda1af5c3583\") "
Feb 24 00:19:08 crc kubenswrapper[4824]: I0224 00:19:08.591626 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cc20437c-c977-4543-a681-cda1af5c3583-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "cc20437c-c977-4543-a681-cda1af5c3583" (UID: "cc20437c-c977-4543-a681-cda1af5c3583"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 24 00:19:08 crc kubenswrapper[4824]: I0224 00:19:08.591832 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cc20437c-c977-4543-a681-cda1af5c3583-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "cc20437c-c977-4543-a681-cda1af5c3583" (UID: "cc20437c-c977-4543-a681-cda1af5c3583"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 24 00:19:08 crc kubenswrapper[4824]: I0224 00:19:08.591881 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc20437c-c977-4543-a681-cda1af5c3583-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "cc20437c-c977-4543-a681-cda1af5c3583" (UID: "cc20437c-c977-4543-a681-cda1af5c3583"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 00:19:08 crc kubenswrapper[4824]: I0224 00:19:08.591965 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc20437c-c977-4543-a681-cda1af5c3583-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "cc20437c-c977-4543-a681-cda1af5c3583" (UID: "cc20437c-c977-4543-a681-cda1af5c3583"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 00:19:08 crc kubenswrapper[4824]: I0224 00:19:08.592397 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc20437c-c977-4543-a681-cda1af5c3583-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "cc20437c-c977-4543-a681-cda1af5c3583" (UID: "cc20437c-c977-4543-a681-cda1af5c3583"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 00:19:08 crc kubenswrapper[4824]: I0224 00:19:08.592499 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc20437c-c977-4543-a681-cda1af5c3583-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "cc20437c-c977-4543-a681-cda1af5c3583" (UID: "cc20437c-c977-4543-a681-cda1af5c3583"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 00:19:08 crc kubenswrapper[4824]: I0224 00:19:08.592566 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc20437c-c977-4543-a681-cda1af5c3583-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "cc20437c-c977-4543-a681-cda1af5c3583" (UID: "cc20437c-c977-4543-a681-cda1af5c3583"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 00:19:08 crc kubenswrapper[4824]: I0224 00:19:08.592607 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc20437c-c977-4543-a681-cda1af5c3583-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "cc20437c-c977-4543-a681-cda1af5c3583" (UID: "cc20437c-c977-4543-a681-cda1af5c3583"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 00:19:08 crc kubenswrapper[4824]: I0224 00:19:08.592893 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc20437c-c977-4543-a681-cda1af5c3583-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "cc20437c-c977-4543-a681-cda1af5c3583" (UID: "cc20437c-c977-4543-a681-cda1af5c3583"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 00:19:08 crc kubenswrapper[4824]: I0224 00:19:08.600960 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc20437c-c977-4543-a681-cda1af5c3583-builder-dockercfg-gqxxs-pull" (OuterVolumeSpecName: "builder-dockercfg-gqxxs-pull") pod "cc20437c-c977-4543-a681-cda1af5c3583" (UID: "cc20437c-c977-4543-a681-cda1af5c3583"). InnerVolumeSpecName "builder-dockercfg-gqxxs-pull". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 00:19:08 crc kubenswrapper[4824]: I0224 00:19:08.601895 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc20437c-c977-4543-a681-cda1af5c3583-kube-api-access-2r7jd" (OuterVolumeSpecName: "kube-api-access-2r7jd") pod "cc20437c-c977-4543-a681-cda1af5c3583" (UID: "cc20437c-c977-4543-a681-cda1af5c3583"). InnerVolumeSpecName "kube-api-access-2r7jd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 00:19:08 crc kubenswrapper[4824]: I0224 00:19:08.603214 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc20437c-c977-4543-a681-cda1af5c3583-builder-dockercfg-gqxxs-push" (OuterVolumeSpecName: "builder-dockercfg-gqxxs-push") pod "cc20437c-c977-4543-a681-cda1af5c3583" (UID: "cc20437c-c977-4543-a681-cda1af5c3583"). InnerVolumeSpecName "builder-dockercfg-gqxxs-push". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 00:19:08 crc kubenswrapper[4824]: I0224 00:19:08.694431 4824 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/cc20437c-c977-4543-a681-cda1af5c3583-build-system-configs\") on node \"crc\" DevicePath \"\""
Feb 24 00:19:08 crc kubenswrapper[4824]: I0224 00:19:08.694707 4824 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/cc20437c-c977-4543-a681-cda1af5c3583-node-pullsecrets\") on node \"crc\" DevicePath \"\""
Feb 24 00:19:08 crc kubenswrapper[4824]: I0224 00:19:08.694792 4824 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/cc20437c-c977-4543-a681-cda1af5c3583-container-storage-run\") on node \"crc\" DevicePath \"\""
Feb 24 00:19:08 crc kubenswrapper[4824]: I0224 00:19:08.694808 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2r7jd\" (UniqueName: \"kubernetes.io/projected/cc20437c-c977-4543-a681-cda1af5c3583-kube-api-access-2r7jd\") on node \"crc\" DevicePath \"\""
Feb 24 00:19:08 crc kubenswrapper[4824]: I0224 00:19:08.694820 4824 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/cc20437c-c977-4543-a681-cda1af5c3583-builder-dockercfg-gqxxs-push\") on node \"crc\" DevicePath \"\""
Feb 24 00:19:08 crc kubenswrapper[4824]: I0224 00:19:08.694833 4824 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/cc20437c-c977-4543-a681-cda1af5c3583-buildcachedir\") on node \"crc\" DevicePath \"\""
Feb 24 00:19:08 crc kubenswrapper[4824]: I0224 00:19:08.694872 4824 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/cc20437c-c977-4543-a681-cda1af5c3583-container-storage-root\") on node \"crc\" DevicePath \"\""
Feb 24 00:19:08 crc kubenswrapper[4824]: I0224 00:19:08.694881 4824 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/cc20437c-c977-4543-a681-cda1af5c3583-buildworkdir\") on node \"crc\" DevicePath \"\""
Feb 24 00:19:08 crc kubenswrapper[4824]: I0224 00:19:08.694893 4824 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cc20437c-c977-4543-a681-cda1af5c3583-build-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 24 00:19:08 crc kubenswrapper[4824]: I0224 00:19:08.694904 4824 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/cc20437c-c977-4543-a681-cda1af5c3583-build-blob-cache\") on node \"crc\" DevicePath \"\""
Feb 24 00:19:08 crc kubenswrapper[4824]: I0224 00:19:08.694914 4824 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cc20437c-c977-4543-a681-cda1af5c3583-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 24 00:19:08 crc kubenswrapper[4824]: I0224 00:19:08.694923 4824 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/cc20437c-c977-4543-a681-cda1af5c3583-builder-dockercfg-gqxxs-pull\") on node \"crc\" DevicePath \"\""
Feb 24 00:19:08 crc kubenswrapper[4824]: I0224 00:19:08.995759 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-2-build_cc20437c-c977-4543-a681-cda1af5c3583/git-clone/0.log"
Feb 24 00:19:08 crc kubenswrapper[4824]: I0224 00:19:08.995832 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"cc20437c-c977-4543-a681-cda1af5c3583","Type":"ContainerDied","Data":"fd9408fdeee02b30a3fa42083cf9391e2cb143354c82385b1e3d293e48b613e4"}
Feb 24 00:19:08 crc kubenswrapper[4824]: I0224 00:19:08.995883 4824 scope.go:117] "RemoveContainer" containerID="57263a4947e6ca13016d819f5f6d5967c292ce21c193f314e05a87993146fc70"
Feb 24 00:19:08 crc kubenswrapper[4824]: I0224 00:19:08.995925 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build"
Feb 24 00:19:09 crc kubenswrapper[4824]: I0224 00:19:09.024717 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"]
Feb 24 00:19:09 crc kubenswrapper[4824]: I0224 00:19:09.032252 4824 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"]
Feb 24 00:19:10 crc kubenswrapper[4824]: I0224 00:19:10.703795 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc20437c-c977-4543-a681-cda1af5c3583" path="/var/lib/kubelet/pods/cc20437c-c977-4543-a681-cda1af5c3583/volumes"
Feb 24 00:19:17 crc kubenswrapper[4824]: I0224 00:19:17.639914 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-3-build"]
Feb 24 00:19:17 crc kubenswrapper[4824]: E0224 00:19:17.642112 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1b50786-48f9-4f1a-bf8b-4686f9baae85" containerName="docker-build"
Feb 24 00:19:17 crc kubenswrapper[4824]: I0224 00:19:17.642134 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1b50786-48f9-4f1a-bf8b-4686f9baae85" containerName="docker-build"
Feb 24 00:19:17 crc kubenswrapper[4824]: E0224 00:19:17.642156 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc20437c-c977-4543-a681-cda1af5c3583" containerName="git-clone"
Feb 24 00:19:17 crc kubenswrapper[4824]: I0224 00:19:17.642165 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc20437c-c977-4543-a681-cda1af5c3583" containerName="git-clone"
Feb 24 00:19:17 crc kubenswrapper[4824]: E0224 00:19:17.642178 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1b50786-48f9-4f1a-bf8b-4686f9baae85" containerName="manage-dockerfile"
Feb 24 00:19:17 crc kubenswrapper[4824]: I0224 00:19:17.642185 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1b50786-48f9-4f1a-bf8b-4686f9baae85" containerName="manage-dockerfile"
Feb 24 00:19:17 crc kubenswrapper[4824]: I0224 00:19:17.642295 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc20437c-c977-4543-a681-cda1af5c3583" containerName="git-clone"
Feb 24 00:19:17 crc kubenswrapper[4824]: I0224 00:19:17.642306 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1b50786-48f9-4f1a-bf8b-4686f9baae85" containerName="docker-build"
Feb 24 00:19:17 crc kubenswrapper[4824]: I0224 00:19:17.643327 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-3-build"
Feb 24 00:19:17 crc kubenswrapper[4824]: I0224 00:19:17.648764 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-3-ca"
Feb 24 00:19:17 crc kubenswrapper[4824]: I0224 00:19:17.649004 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-3-sys-config"
Feb 24 00:19:17 crc kubenswrapper[4824]: I0224 00:19:17.649076 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-3-global-ca"
Feb 24 00:19:17 crc kubenswrapper[4824]: I0224 00:19:17.649486 4824 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-gqxxs"
Feb 24 00:19:17 crc kubenswrapper[4824]: I0224 00:19:17.668011 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-3-build"]
Feb 24 00:19:17 crc kubenswrapper[4824]: I0224 00:19:17.739836 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6c95f5c0-186a-4a4b-867d-88660b3edf1f-node-pullsecrets\") pod \"service-telemetry-operator-3-build\" (UID: \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\") " pod="service-telemetry/service-telemetry-operator-3-build"
Feb 24 00:19:17 crc kubenswrapper[4824]: I0224 00:19:17.739905 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/6c95f5c0-186a-4a4b-867d-88660b3edf1f-builder-dockercfg-gqxxs-pull\") pod \"service-telemetry-operator-3-build\" (UID: \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\") " pod="service-telemetry/service-telemetry-operator-3-build"
Feb 24 00:19:17 crc kubenswrapper[4824]: I0224 00:19:17.739935 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/6c95f5c0-186a-4a4b-867d-88660b3edf1f-container-storage-root\") pod \"service-telemetry-operator-3-build\" (UID: \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\") " pod="service-telemetry/service-telemetry-operator-3-build"
Feb 24 00:19:17 crc kubenswrapper[4824]: I0224 00:19:17.740079 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/6c95f5c0-186a-4a4b-867d-88660b3edf1f-build-blob-cache\") pod \"service-telemetry-operator-3-build\" (UID: \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\") " pod="service-telemetry/service-telemetry-operator-3-build"
Feb 24 00:19:17 crc kubenswrapper[4824]: I0224 00:19:17.740113 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/6c95f5c0-186a-4a4b-867d-88660b3edf1f-buildcachedir\") pod \"service-telemetry-operator-3-build\" (UID: \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\") " pod="service-telemetry/service-telemetry-operator-3-build"
Feb 24 00:19:17 crc kubenswrapper[4824]: I0224 00:19:17.740153 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/6c95f5c0-186a-4a4b-867d-88660b3edf1f-buildworkdir\") pod \"service-telemetry-operator-3-build\" (UID: \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\") " pod="service-telemetry/service-telemetry-operator-3-build"
Feb 24 00:19:17 crc kubenswrapper[4824]: I0224 00:19:17.740181 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6c95f5c0-186a-4a4b-867d-88660b3edf1f-build-ca-bundles\") pod \"service-telemetry-operator-3-build\" (UID: \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\") " pod="service-telemetry/service-telemetry-operator-3-build"
Feb 24 00:19:17 crc kubenswrapper[4824]: I0224 00:19:17.740220 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/6c95f5c0-186a-4a4b-867d-88660b3edf1f-container-storage-run\") pod \"service-telemetry-operator-3-build\" (UID: \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\") " pod="service-telemetry/service-telemetry-operator-3-build"
Feb 24 00:19:17 crc kubenswrapper[4824]: I0224 00:19:17.740248 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6c95f5c0-186a-4a4b-867d-88660b3edf1f-build-proxy-ca-bundles\") pod \"service-telemetry-operator-3-build\" (UID: \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\") " pod="service-telemetry/service-telemetry-operator-3-build"
Feb 24 00:19:17 crc kubenswrapper[4824]: I0224 00:19:17.740281 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/6c95f5c0-186a-4a4b-867d-88660b3edf1f-build-system-configs\") pod \"service-telemetry-operator-3-build\" (UID: \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\") " pod="service-telemetry/service-telemetry-operator-3-build"
Feb 24 00:19:17 crc kubenswrapper[4824]: I0224 00:19:17.740309 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/6c95f5c0-186a-4a4b-867d-88660b3edf1f-builder-dockercfg-gqxxs-push\") pod \"service-telemetry-operator-3-build\" (UID: \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\") " pod="service-telemetry/service-telemetry-operator-3-build"
Feb 24 00:19:17 crc kubenswrapper[4824]: I0224 00:19:17.740414 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvn47\" (UniqueName: \"kubernetes.io/projected/6c95f5c0-186a-4a4b-867d-88660b3edf1f-kube-api-access-dvn47\") pod \"service-telemetry-operator-3-build\" (UID: \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\") " pod="service-telemetry/service-telemetry-operator-3-build"
Feb 24 00:19:17 crc kubenswrapper[4824]: I0224 00:19:17.841931 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/6c95f5c0-186a-4a4b-867d-88660b3edf1f-container-storage-run\") pod \"service-telemetry-operator-3-build\" (UID: \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\") " pod="service-telemetry/service-telemetry-operator-3-build"
Feb 24 00:19:17 crc kubenswrapper[4824]: I0224 00:19:17.842046 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6c95f5c0-186a-4a4b-867d-88660b3edf1f-build-proxy-ca-bundles\") pod \"service-telemetry-operator-3-build\" (UID: \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\") " pod="service-telemetry/service-telemetry-operator-3-build"
Feb 24 00:19:17 crc kubenswrapper[4824]: I0224 00:19:17.842083 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/6c95f5c0-186a-4a4b-867d-88660b3edf1f-build-system-configs\") pod \"service-telemetry-operator-3-build\" (UID: \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\") " pod="service-telemetry/service-telemetry-operator-3-build"
Feb 24 00:19:17 crc kubenswrapper[4824]: I0224 00:19:17.842124 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/6c95f5c0-186a-4a4b-867d-88660b3edf1f-builder-dockercfg-gqxxs-push\") pod \"service-telemetry-operator-3-build\" (UID: \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\") " pod="service-telemetry/service-telemetry-operator-3-build"
Feb 24 00:19:17 crc kubenswrapper[4824]: I0224 00:19:17.842214 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvn47\" (UniqueName: \"kubernetes.io/projected/6c95f5c0-186a-4a4b-867d-88660b3edf1f-kube-api-access-dvn47\") pod \"service-telemetry-operator-3-build\" (UID: \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\") " pod="service-telemetry/service-telemetry-operator-3-build"
Feb 24 00:19:17 crc kubenswrapper[4824]: I0224 00:19:17.842273 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6c95f5c0-186a-4a4b-867d-88660b3edf1f-node-pullsecrets\") pod \"service-telemetry-operator-3-build\" (UID: \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\") " pod="service-telemetry/service-telemetry-operator-3-build"
Feb 24 00:19:17 crc kubenswrapper[4824]: I0224 00:19:17.842347 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/6c95f5c0-186a-4a4b-867d-88660b3edf1f-builder-dockercfg-gqxxs-pull\") pod \"service-telemetry-operator-3-build\" (UID: \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\") " pod="service-telemetry/service-telemetry-operator-3-build"
Feb 24 00:19:17 crc kubenswrapper[4824]: I0224 00:19:17.842404 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/6c95f5c0-186a-4a4b-867d-88660b3edf1f-container-storage-root\") pod \"service-telemetry-operator-3-build\" (UID: \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\") " pod="service-telemetry/service-telemetry-operator-3-build"
Feb 24 00:19:17 crc kubenswrapper[4824]: I0224 00:19:17.842502 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/6c95f5c0-186a-4a4b-867d-88660b3edf1f-build-blob-cache\") pod \"service-telemetry-operator-3-build\" (UID: \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\") " pod="service-telemetry/service-telemetry-operator-3-build"
Feb 24 00:19:17 crc kubenswrapper[4824]: I0224 00:19:17.842589 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/6c95f5c0-186a-4a4b-867d-88660b3edf1f-buildcachedir\") pod \"service-telemetry-operator-3-build\" (UID: \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\") " pod="service-telemetry/service-telemetry-operator-3-build"
Feb 24 00:19:17 crc kubenswrapper[4824]: I0224 00:19:17.842647 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/6c95f5c0-186a-4a4b-867d-88660b3edf1f-buildworkdir\") pod \"service-telemetry-operator-3-build\" (UID: \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\") " pod="service-telemetry/service-telemetry-operator-3-build"
Feb 24 00:19:17 crc kubenswrapper[4824]: I0224 00:19:17.842776 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/6c95f5c0-186a-4a4b-867d-88660b3edf1f-container-storage-run\") pod \"service-telemetry-operator-3-build\" (UID: \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\") " pod="service-telemetry/service-telemetry-operator-3-build"
Feb 24 00:19:17 crc kubenswrapper[4824]: I0224 00:19:17.842799 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/6c95f5c0-186a-4a4b-867d-88660b3edf1f-buildcachedir\") pod \"service-telemetry-operator-3-build\" (UID: \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\") " pod="service-telemetry/service-telemetry-operator-3-build"
Feb 24 00:19:17 crc kubenswrapper[4824]: I0224 00:19:17.842968 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6c95f5c0-186a-4a4b-867d-88660b3edf1f-build-ca-bundles\") pod \"service-telemetry-operator-3-build\" (UID: \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\") " pod="service-telemetry/service-telemetry-operator-3-build"
Feb 24 00:19:17 crc kubenswrapper[4824]: I0224 00:19:17.843071 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/6c95f5c0-186a-4a4b-867d-88660b3edf1f-container-storage-root\") pod \"service-telemetry-operator-3-build\" (UID: \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\") " pod="service-telemetry/service-telemetry-operator-3-build"
Feb 24 00:19:17 crc kubenswrapper[4824]: I0224 00:19:17.843180 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6c95f5c0-186a-4a4b-867d-88660b3edf1f-build-proxy-ca-bundles\") pod \"service-telemetry-operator-3-build\" (UID: \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\") " pod="service-telemetry/service-telemetry-operator-3-build"
Feb 24 00:19:17 crc kubenswrapper[4824]: I0224 00:19:17.843197 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/6c95f5c0-186a-4a4b-867d-88660b3edf1f-buildworkdir\") pod \"service-telemetry-operator-3-build\" (UID: \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\") " pod="service-telemetry/service-telemetry-operator-3-build"
Feb 24 00:19:17 crc kubenswrapper[4824]: I0224 00:19:17.843431 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/6c95f5c0-186a-4a4b-867d-88660b3edf1f-build-blob-cache\") pod \"service-telemetry-operator-3-build\" (UID: \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\") " pod="service-telemetry/service-telemetry-operator-3-build"
Feb 24 00:19:17 crc kubenswrapper[4824]: I0224 00:19:17.843881 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6c95f5c0-186a-4a4b-867d-88660b3edf1f-build-ca-bundles\") pod \"service-telemetry-operator-3-build\" (UID: \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\") " pod="service-telemetry/service-telemetry-operator-3-build"
Feb 24 00:19:17 crc kubenswrapper[4824]: I0224 00:19:17.843962 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6c95f5c0-186a-4a4b-867d-88660b3edf1f-node-pullsecrets\") pod \"service-telemetry-operator-3-build\" (UID: \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\") " pod="service-telemetry/service-telemetry-operator-3-build"
Feb 24 00:19:17 crc kubenswrapper[4824]: I0224 00:19:17.844616 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/6c95f5c0-186a-4a4b-867d-88660b3edf1f-build-system-configs\") pod \"service-telemetry-operator-3-build\" (UID: \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\") " pod="service-telemetry/service-telemetry-operator-3-build"
Feb 24 00:19:17 crc kubenswrapper[4824]: I0224 00:19:17.854277 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/6c95f5c0-186a-4a4b-867d-88660b3edf1f-builder-dockercfg-gqxxs-push\") pod \"service-telemetry-operator-3-build\" (UID: \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\") " pod="service-telemetry/service-telemetry-operator-3-build"
Feb 24 00:19:17 crc kubenswrapper[4824]: I0224 00:19:17.854273 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/6c95f5c0-186a-4a4b-867d-88660b3edf1f-builder-dockercfg-gqxxs-pull\") pod \"service-telemetry-operator-3-build\" (UID: \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\") " pod="service-telemetry/service-telemetry-operator-3-build"
Feb 24 00:19:17 crc kubenswrapper[4824]: I0224 00:19:17.861328 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvn47\" (UniqueName: \"kubernetes.io/projected/6c95f5c0-186a-4a4b-867d-88660b3edf1f-kube-api-access-dvn47\") pod \"service-telemetry-operator-3-build\" (UID: \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\") " pod="service-telemetry/service-telemetry-operator-3-build"
Feb 24 00:19:17 crc kubenswrapper[4824]: I0224 00:19:17.962541 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-3-build"
Feb 24 00:19:18 crc kubenswrapper[4824]: I0224 00:19:18.188604 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-3-build"]
Feb 24 00:19:19 crc kubenswrapper[4824]: I0224 00:19:19.063814 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-3-build" event={"ID":"6c95f5c0-186a-4a4b-867d-88660b3edf1f","Type":"ContainerStarted","Data":"38e58f7cc777ad190c4f6b480752e9102012919dd4b65943269d93402ea90dad"}
Feb 24 00:19:19 crc kubenswrapper[4824]: I0224 00:19:19.064293 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-3-build" event={"ID":"6c95f5c0-186a-4a4b-867d-88660b3edf1f","Type":"ContainerStarted","Data":"d5eb4c33df13b7de6ce3c2cf8562f2a4d64e137c350b97d64bc6f16d1d0fafdc"}
Feb 24 00:19:19 crc kubenswrapper[4824]: E0224 00:19:19.129360 4824 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=4548264831213021485, SKID=, AKID=10:68:53:25:D1:3D:2E:E3:54:0D:95:0B:6A:90:F1:BD:A9:4F:27:6E failed: x509: certificate signed by unknown authority"
Feb 24 00:19:20 crc kubenswrapper[4824]: I0224 00:19:20.162149 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-3-build"]
Feb 24 00:19:21 crc kubenswrapper[4824]: I0224 00:19:21.078685 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/service-telemetry-operator-3-build" podUID="6c95f5c0-186a-4a4b-867d-88660b3edf1f" containerName="git-clone" containerID="cri-o://38e58f7cc777ad190c4f6b480752e9102012919dd4b65943269d93402ea90dad" gracePeriod=30
Feb 24 00:19:21 crc kubenswrapper[4824]: I0224 00:19:21.439232 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-3-build_6c95f5c0-186a-4a4b-867d-88660b3edf1f/git-clone/0.log"
Feb 24 00:19:21 crc kubenswrapper[4824]: I0224 00:19:21.439855 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-3-build"
Feb 24 00:19:21 crc kubenswrapper[4824]: I0224 00:19:21.501186 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/6c95f5c0-186a-4a4b-867d-88660b3edf1f-buildworkdir\") pod \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\" (UID: \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\") "
Feb 24 00:19:21 crc kubenswrapper[4824]: I0224 00:19:21.501324 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6c95f5c0-186a-4a4b-867d-88660b3edf1f-build-proxy-ca-bundles\") pod \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\" (UID: \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\") "
Feb 24 00:19:21 crc kubenswrapper[4824]: I0224 00:19:21.501355 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/6c95f5c0-186a-4a4b-867d-88660b3edf1f-build-system-configs\") pod \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\" (UID: \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\") "
Feb 24 00:19:21 crc kubenswrapper[4824]: I0224 00:19:21.501504 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/6c95f5c0-186a-4a4b-867d-88660b3edf1f-builder-dockercfg-gqxxs-pull\") pod \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\" (UID: \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\") "
Feb 24 00:19:21 crc kubenswrapper[4824]: I0224 00:19:21.502960 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/6c95f5c0-186a-4a4b-867d-88660b3edf1f-build-blob-cache\") pod \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\" (UID: \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\") "
Feb 24 00:19:21 crc kubenswrapper[4824]: I0224 00:19:21.502991 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvn47\" (UniqueName: \"kubernetes.io/projected/6c95f5c0-186a-4a4b-867d-88660b3edf1f-kube-api-access-dvn47\") pod \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\" (UID: \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\") "
Feb 24 00:19:21 crc kubenswrapper[4824]: I0224 00:19:21.503018 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/6c95f5c0-186a-4a4b-867d-88660b3edf1f-container-storage-run\") pod \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\" (UID: \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\") "
Feb 24 00:19:21 crc kubenswrapper[4824]: I0224 00:19:21.503054 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6c95f5c0-186a-4a4b-867d-88660b3edf1f-node-pullsecrets\") pod \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\" (UID: \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\") "
Feb 24 00:19:21 crc kubenswrapper[4824]: I0224 00:19:21.503081 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6c95f5c0-186a-4a4b-867d-88660b3edf1f-build-ca-bundles\")
pod \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\" (UID: \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\") " Feb 24 00:19:21 crc kubenswrapper[4824]: I0224 00:19:21.503115 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/6c95f5c0-186a-4a4b-867d-88660b3edf1f-container-storage-root\") pod \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\" (UID: \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\") " Feb 24 00:19:21 crc kubenswrapper[4824]: I0224 00:19:21.503201 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/6c95f5c0-186a-4a4b-867d-88660b3edf1f-builder-dockercfg-gqxxs-push\") pod \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\" (UID: \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\") " Feb 24 00:19:21 crc kubenswrapper[4824]: I0224 00:19:21.503254 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/6c95f5c0-186a-4a4b-867d-88660b3edf1f-buildcachedir\") pod \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\" (UID: \"6c95f5c0-186a-4a4b-867d-88660b3edf1f\") " Feb 24 00:19:21 crc kubenswrapper[4824]: I0224 00:19:21.502374 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c95f5c0-186a-4a4b-867d-88660b3edf1f-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "6c95f5c0-186a-4a4b-867d-88660b3edf1f" (UID: "6c95f5c0-186a-4a4b-867d-88660b3edf1f"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:19:21 crc kubenswrapper[4824]: I0224 00:19:21.502427 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c95f5c0-186a-4a4b-867d-88660b3edf1f-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "6c95f5c0-186a-4a4b-867d-88660b3edf1f" (UID: "6c95f5c0-186a-4a4b-867d-88660b3edf1f"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:19:21 crc kubenswrapper[4824]: I0224 00:19:21.502757 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c95f5c0-186a-4a4b-867d-88660b3edf1f-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "6c95f5c0-186a-4a4b-867d-88660b3edf1f" (UID: "6c95f5c0-186a-4a4b-867d-88660b3edf1f"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:19:21 crc kubenswrapper[4824]: I0224 00:19:21.503154 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6c95f5c0-186a-4a4b-867d-88660b3edf1f-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "6c95f5c0-186a-4a4b-867d-88660b3edf1f" (UID: "6c95f5c0-186a-4a4b-867d-88660b3edf1f"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 00:19:21 crc kubenswrapper[4824]: I0224 00:19:21.503327 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c95f5c0-186a-4a4b-867d-88660b3edf1f-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "6c95f5c0-186a-4a4b-867d-88660b3edf1f" (UID: "6c95f5c0-186a-4a4b-867d-88660b3edf1f"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:19:21 crc kubenswrapper[4824]: I0224 00:19:21.503783 4824 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/6c95f5c0-186a-4a4b-867d-88660b3edf1f-buildworkdir\") on node \"crc\" DevicePath \"\"" Feb 24 00:19:21 crc kubenswrapper[4824]: I0224 00:19:21.504066 4824 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/6c95f5c0-186a-4a4b-867d-88660b3edf1f-build-system-configs\") on node \"crc\" DevicePath \"\"" Feb 24 00:19:21 crc kubenswrapper[4824]: I0224 00:19:21.504088 4824 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6c95f5c0-186a-4a4b-867d-88660b3edf1f-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 24 00:19:21 crc kubenswrapper[4824]: I0224 00:19:21.504103 4824 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6c95f5c0-186a-4a4b-867d-88660b3edf1f-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Feb 24 00:19:21 crc kubenswrapper[4824]: I0224 00:19:21.503571 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6c95f5c0-186a-4a4b-867d-88660b3edf1f-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "6c95f5c0-186a-4a4b-867d-88660b3edf1f" (UID: "6c95f5c0-186a-4a4b-867d-88660b3edf1f"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 00:19:21 crc kubenswrapper[4824]: I0224 00:19:21.503641 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c95f5c0-186a-4a4b-867d-88660b3edf1f-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "6c95f5c0-186a-4a4b-867d-88660b3edf1f" (UID: "6c95f5c0-186a-4a4b-867d-88660b3edf1f"). 
InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:19:21 crc kubenswrapper[4824]: I0224 00:19:21.503772 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c95f5c0-186a-4a4b-867d-88660b3edf1f-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "6c95f5c0-186a-4a4b-867d-88660b3edf1f" (UID: "6c95f5c0-186a-4a4b-867d-88660b3edf1f"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:19:21 crc kubenswrapper[4824]: I0224 00:19:21.503915 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c95f5c0-186a-4a4b-867d-88660b3edf1f-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "6c95f5c0-186a-4a4b-867d-88660b3edf1f" (UID: "6c95f5c0-186a-4a4b-867d-88660b3edf1f"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:19:21 crc kubenswrapper[4824]: I0224 00:19:21.508242 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c95f5c0-186a-4a4b-867d-88660b3edf1f-builder-dockercfg-gqxxs-pull" (OuterVolumeSpecName: "builder-dockercfg-gqxxs-pull") pod "6c95f5c0-186a-4a4b-867d-88660b3edf1f" (UID: "6c95f5c0-186a-4a4b-867d-88660b3edf1f"). InnerVolumeSpecName "builder-dockercfg-gqxxs-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:19:21 crc kubenswrapper[4824]: I0224 00:19:21.508628 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c95f5c0-186a-4a4b-867d-88660b3edf1f-kube-api-access-dvn47" (OuterVolumeSpecName: "kube-api-access-dvn47") pod "6c95f5c0-186a-4a4b-867d-88660b3edf1f" (UID: "6c95f5c0-186a-4a4b-867d-88660b3edf1f"). InnerVolumeSpecName "kube-api-access-dvn47". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:19:21 crc kubenswrapper[4824]: I0224 00:19:21.508697 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c95f5c0-186a-4a4b-867d-88660b3edf1f-builder-dockercfg-gqxxs-push" (OuterVolumeSpecName: "builder-dockercfg-gqxxs-push") pod "6c95f5c0-186a-4a4b-867d-88660b3edf1f" (UID: "6c95f5c0-186a-4a4b-867d-88660b3edf1f"). InnerVolumeSpecName "builder-dockercfg-gqxxs-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:19:21 crc kubenswrapper[4824]: I0224 00:19:21.606158 4824 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/6c95f5c0-186a-4a4b-867d-88660b3edf1f-container-storage-root\") on node \"crc\" DevicePath \"\"" Feb 24 00:19:21 crc kubenswrapper[4824]: I0224 00:19:21.606203 4824 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/6c95f5c0-186a-4a4b-867d-88660b3edf1f-builder-dockercfg-gqxxs-push\") on node \"crc\" DevicePath \"\"" Feb 24 00:19:21 crc kubenswrapper[4824]: I0224 00:19:21.606218 4824 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/6c95f5c0-186a-4a4b-867d-88660b3edf1f-buildcachedir\") on node \"crc\" DevicePath \"\"" Feb 24 00:19:21 crc kubenswrapper[4824]: I0224 00:19:21.606230 4824 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/6c95f5c0-186a-4a4b-867d-88660b3edf1f-builder-dockercfg-gqxxs-pull\") on node \"crc\" DevicePath \"\"" Feb 24 00:19:21 crc kubenswrapper[4824]: I0224 00:19:21.606242 4824 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/6c95f5c0-186a-4a4b-867d-88660b3edf1f-build-blob-cache\") on node \"crc\" DevicePath \"\"" Feb 24 00:19:21 crc kubenswrapper[4824]: I0224 
00:19:21.606254 4824 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/6c95f5c0-186a-4a4b-867d-88660b3edf1f-container-storage-run\") on node \"crc\" DevicePath \"\"" Feb 24 00:19:21 crc kubenswrapper[4824]: I0224 00:19:21.606263 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvn47\" (UniqueName: \"kubernetes.io/projected/6c95f5c0-186a-4a4b-867d-88660b3edf1f-kube-api-access-dvn47\") on node \"crc\" DevicePath \"\"" Feb 24 00:19:21 crc kubenswrapper[4824]: I0224 00:19:21.606272 4824 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6c95f5c0-186a-4a4b-867d-88660b3edf1f-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 24 00:19:22 crc kubenswrapper[4824]: I0224 00:19:22.087255 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-3-build_6c95f5c0-186a-4a4b-867d-88660b3edf1f/git-clone/0.log" Feb 24 00:19:22 crc kubenswrapper[4824]: I0224 00:19:22.087312 4824 generic.go:334] "Generic (PLEG): container finished" podID="6c95f5c0-186a-4a4b-867d-88660b3edf1f" containerID="38e58f7cc777ad190c4f6b480752e9102012919dd4b65943269d93402ea90dad" exitCode=1 Feb 24 00:19:22 crc kubenswrapper[4824]: I0224 00:19:22.087351 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-3-build" event={"ID":"6c95f5c0-186a-4a4b-867d-88660b3edf1f","Type":"ContainerDied","Data":"38e58f7cc777ad190c4f6b480752e9102012919dd4b65943269d93402ea90dad"} Feb 24 00:19:22 crc kubenswrapper[4824]: I0224 00:19:22.087388 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-3-build" event={"ID":"6c95f5c0-186a-4a4b-867d-88660b3edf1f","Type":"ContainerDied","Data":"d5eb4c33df13b7de6ce3c2cf8562f2a4d64e137c350b97d64bc6f16d1d0fafdc"} Feb 24 00:19:22 crc kubenswrapper[4824]: I0224 00:19:22.087409 4824 
scope.go:117] "RemoveContainer" containerID="38e58f7cc777ad190c4f6b480752e9102012919dd4b65943269d93402ea90dad" Feb 24 00:19:22 crc kubenswrapper[4824]: I0224 00:19:22.087558 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-3-build" Feb 24 00:19:22 crc kubenswrapper[4824]: I0224 00:19:22.115926 4824 scope.go:117] "RemoveContainer" containerID="38e58f7cc777ad190c4f6b480752e9102012919dd4b65943269d93402ea90dad" Feb 24 00:19:22 crc kubenswrapper[4824]: E0224 00:19:22.118719 4824 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38e58f7cc777ad190c4f6b480752e9102012919dd4b65943269d93402ea90dad\": container with ID starting with 38e58f7cc777ad190c4f6b480752e9102012919dd4b65943269d93402ea90dad not found: ID does not exist" containerID="38e58f7cc777ad190c4f6b480752e9102012919dd4b65943269d93402ea90dad" Feb 24 00:19:22 crc kubenswrapper[4824]: I0224 00:19:22.118783 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38e58f7cc777ad190c4f6b480752e9102012919dd4b65943269d93402ea90dad"} err="failed to get container status \"38e58f7cc777ad190c4f6b480752e9102012919dd4b65943269d93402ea90dad\": rpc error: code = NotFound desc = could not find container \"38e58f7cc777ad190c4f6b480752e9102012919dd4b65943269d93402ea90dad\": container with ID starting with 38e58f7cc777ad190c4f6b480752e9102012919dd4b65943269d93402ea90dad not found: ID does not exist" Feb 24 00:19:22 crc kubenswrapper[4824]: I0224 00:19:22.126700 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-3-build"] Feb 24 00:19:22 crc kubenswrapper[4824]: I0224 00:19:22.132138 4824 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-operator-3-build"] Feb 24 00:19:22 crc kubenswrapper[4824]: I0224 00:19:22.706487 4824 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="6c95f5c0-186a-4a4b-867d-88660b3edf1f" path="/var/lib/kubelet/pods/6c95f5c0-186a-4a4b-867d-88660b3edf1f/volumes" Feb 24 00:19:23 crc kubenswrapper[4824]: I0224 00:19:23.276659 4824 patch_prober.go:28] interesting pod/machine-config-daemon-vcbgn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 00:19:23 crc kubenswrapper[4824]: I0224 00:19:23.276752 4824 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 00:19:31 crc kubenswrapper[4824]: I0224 00:19:31.536519 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-4-build"] Feb 24 00:19:31 crc kubenswrapper[4824]: E0224 00:19:31.537409 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c95f5c0-186a-4a4b-867d-88660b3edf1f" containerName="git-clone" Feb 24 00:19:31 crc kubenswrapper[4824]: I0224 00:19:31.537430 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c95f5c0-186a-4a4b-867d-88660b3edf1f" containerName="git-clone" Feb 24 00:19:31 crc kubenswrapper[4824]: I0224 00:19:31.537647 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c95f5c0-186a-4a4b-867d-88660b3edf1f" containerName="git-clone" Feb 24 00:19:31 crc kubenswrapper[4824]: I0224 00:19:31.539041 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-4-build" Feb 24 00:19:31 crc kubenswrapper[4824]: I0224 00:19:31.541699 4824 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-gqxxs" Feb 24 00:19:31 crc kubenswrapper[4824]: I0224 00:19:31.541986 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-4-global-ca" Feb 24 00:19:31 crc kubenswrapper[4824]: I0224 00:19:31.542476 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-4-ca" Feb 24 00:19:31 crc kubenswrapper[4824]: I0224 00:19:31.544078 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-4-sys-config" Feb 24 00:19:31 crc kubenswrapper[4824]: I0224 00:19:31.551663 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-4-build"] Feb 24 00:19:31 crc kubenswrapper[4824]: I0224 00:19:31.572592 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/cb882313-5084-4bd0-b5aa-25322ecd66ac-builder-dockercfg-gqxxs-push\") pod \"service-telemetry-operator-4-build\" (UID: \"cb882313-5084-4bd0-b5aa-25322ecd66ac\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 24 00:19:31 crc kubenswrapper[4824]: I0224 00:19:31.572645 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/cb882313-5084-4bd0-b5aa-25322ecd66ac-buildworkdir\") pod \"service-telemetry-operator-4-build\" (UID: \"cb882313-5084-4bd0-b5aa-25322ecd66ac\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 24 00:19:31 crc kubenswrapper[4824]: I0224 00:19:31.572683 4824 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/cb882313-5084-4bd0-b5aa-25322ecd66ac-container-storage-root\") pod \"service-telemetry-operator-4-build\" (UID: \"cb882313-5084-4bd0-b5aa-25322ecd66ac\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 24 00:19:31 crc kubenswrapper[4824]: I0224 00:19:31.572711 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cb882313-5084-4bd0-b5aa-25322ecd66ac-build-proxy-ca-bundles\") pod \"service-telemetry-operator-4-build\" (UID: \"cb882313-5084-4bd0-b5aa-25322ecd66ac\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 24 00:19:31 crc kubenswrapper[4824]: I0224 00:19:31.572729 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/cb882313-5084-4bd0-b5aa-25322ecd66ac-build-blob-cache\") pod \"service-telemetry-operator-4-build\" (UID: \"cb882313-5084-4bd0-b5aa-25322ecd66ac\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 24 00:19:31 crc kubenswrapper[4824]: I0224 00:19:31.572767 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cb882313-5084-4bd0-b5aa-25322ecd66ac-build-ca-bundles\") pod \"service-telemetry-operator-4-build\" (UID: \"cb882313-5084-4bd0-b5aa-25322ecd66ac\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 24 00:19:31 crc kubenswrapper[4824]: I0224 00:19:31.572788 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7p5t\" (UniqueName: \"kubernetes.io/projected/cb882313-5084-4bd0-b5aa-25322ecd66ac-kube-api-access-p7p5t\") pod \"service-telemetry-operator-4-build\" (UID: 
\"cb882313-5084-4bd0-b5aa-25322ecd66ac\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 24 00:19:31 crc kubenswrapper[4824]: I0224 00:19:31.572813 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/cb882313-5084-4bd0-b5aa-25322ecd66ac-build-system-configs\") pod \"service-telemetry-operator-4-build\" (UID: \"cb882313-5084-4bd0-b5aa-25322ecd66ac\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 24 00:19:31 crc kubenswrapper[4824]: I0224 00:19:31.572835 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/cb882313-5084-4bd0-b5aa-25322ecd66ac-container-storage-run\") pod \"service-telemetry-operator-4-build\" (UID: \"cb882313-5084-4bd0-b5aa-25322ecd66ac\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 24 00:19:31 crc kubenswrapper[4824]: I0224 00:19:31.572867 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/cb882313-5084-4bd0-b5aa-25322ecd66ac-node-pullsecrets\") pod \"service-telemetry-operator-4-build\" (UID: \"cb882313-5084-4bd0-b5aa-25322ecd66ac\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 24 00:19:31 crc kubenswrapper[4824]: I0224 00:19:31.573124 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/cb882313-5084-4bd0-b5aa-25322ecd66ac-buildcachedir\") pod \"service-telemetry-operator-4-build\" (UID: \"cb882313-5084-4bd0-b5aa-25322ecd66ac\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 24 00:19:31 crc kubenswrapper[4824]: I0224 00:19:31.573229 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/cb882313-5084-4bd0-b5aa-25322ecd66ac-builder-dockercfg-gqxxs-pull\") pod \"service-telemetry-operator-4-build\" (UID: \"cb882313-5084-4bd0-b5aa-25322ecd66ac\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 24 00:19:31 crc kubenswrapper[4824]: I0224 00:19:31.674291 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/cb882313-5084-4bd0-b5aa-25322ecd66ac-buildcachedir\") pod \"service-telemetry-operator-4-build\" (UID: \"cb882313-5084-4bd0-b5aa-25322ecd66ac\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 24 00:19:31 crc kubenswrapper[4824]: I0224 00:19:31.674349 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/cb882313-5084-4bd0-b5aa-25322ecd66ac-builder-dockercfg-gqxxs-pull\") pod \"service-telemetry-operator-4-build\" (UID: \"cb882313-5084-4bd0-b5aa-25322ecd66ac\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 24 00:19:31 crc kubenswrapper[4824]: I0224 00:19:31.674394 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/cb882313-5084-4bd0-b5aa-25322ecd66ac-builder-dockercfg-gqxxs-push\") pod \"service-telemetry-operator-4-build\" (UID: \"cb882313-5084-4bd0-b5aa-25322ecd66ac\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 24 00:19:31 crc kubenswrapper[4824]: I0224 00:19:31.674492 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/cb882313-5084-4bd0-b5aa-25322ecd66ac-buildcachedir\") pod \"service-telemetry-operator-4-build\" (UID: \"cb882313-5084-4bd0-b5aa-25322ecd66ac\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 24 00:19:31 crc 
kubenswrapper[4824]: I0224 00:19:31.675014 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/cb882313-5084-4bd0-b5aa-25322ecd66ac-buildworkdir\") pod \"service-telemetry-operator-4-build\" (UID: \"cb882313-5084-4bd0-b5aa-25322ecd66ac\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 24 00:19:31 crc kubenswrapper[4824]: I0224 00:19:31.675270 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/cb882313-5084-4bd0-b5aa-25322ecd66ac-buildworkdir\") pod \"service-telemetry-operator-4-build\" (UID: \"cb882313-5084-4bd0-b5aa-25322ecd66ac\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 24 00:19:31 crc kubenswrapper[4824]: I0224 00:19:31.675476 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/cb882313-5084-4bd0-b5aa-25322ecd66ac-container-storage-root\") pod \"service-telemetry-operator-4-build\" (UID: \"cb882313-5084-4bd0-b5aa-25322ecd66ac\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 24 00:19:31 crc kubenswrapper[4824]: I0224 00:19:31.675552 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cb882313-5084-4bd0-b5aa-25322ecd66ac-build-proxy-ca-bundles\") pod \"service-telemetry-operator-4-build\" (UID: \"cb882313-5084-4bd0-b5aa-25322ecd66ac\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 24 00:19:31 crc kubenswrapper[4824]: I0224 00:19:31.675598 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/cb882313-5084-4bd0-b5aa-25322ecd66ac-build-blob-cache\") pod \"service-telemetry-operator-4-build\" (UID: \"cb882313-5084-4bd0-b5aa-25322ecd66ac\") " 
pod="service-telemetry/service-telemetry-operator-4-build" Feb 24 00:19:31 crc kubenswrapper[4824]: I0224 00:19:31.675718 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cb882313-5084-4bd0-b5aa-25322ecd66ac-build-ca-bundles\") pod \"service-telemetry-operator-4-build\" (UID: \"cb882313-5084-4bd0-b5aa-25322ecd66ac\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 24 00:19:31 crc kubenswrapper[4824]: I0224 00:19:31.675743 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7p5t\" (UniqueName: \"kubernetes.io/projected/cb882313-5084-4bd0-b5aa-25322ecd66ac-kube-api-access-p7p5t\") pod \"service-telemetry-operator-4-build\" (UID: \"cb882313-5084-4bd0-b5aa-25322ecd66ac\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 24 00:19:31 crc kubenswrapper[4824]: I0224 00:19:31.675754 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/cb882313-5084-4bd0-b5aa-25322ecd66ac-container-storage-root\") pod \"service-telemetry-operator-4-build\" (UID: \"cb882313-5084-4bd0-b5aa-25322ecd66ac\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 24 00:19:31 crc kubenswrapper[4824]: I0224 00:19:31.675822 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/cb882313-5084-4bd0-b5aa-25322ecd66ac-build-system-configs\") pod \"service-telemetry-operator-4-build\" (UID: \"cb882313-5084-4bd0-b5aa-25322ecd66ac\") " pod="service-telemetry/service-telemetry-operator-4-build" Feb 24 00:19:31 crc kubenswrapper[4824]: I0224 00:19:31.675846 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: 
\"kubernetes.io/empty-dir/cb882313-5084-4bd0-b5aa-25322ecd66ac-container-storage-run\") pod \"service-telemetry-operator-4-build\" (UID: \"cb882313-5084-4bd0-b5aa-25322ecd66ac\") " pod="service-telemetry/service-telemetry-operator-4-build"
Feb 24 00:19:31 crc kubenswrapper[4824]: I0224 00:19:31.675907 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/cb882313-5084-4bd0-b5aa-25322ecd66ac-node-pullsecrets\") pod \"service-telemetry-operator-4-build\" (UID: \"cb882313-5084-4bd0-b5aa-25322ecd66ac\") " pod="service-telemetry/service-telemetry-operator-4-build"
Feb 24 00:19:31 crc kubenswrapper[4824]: I0224 00:19:31.676011 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/cb882313-5084-4bd0-b5aa-25322ecd66ac-build-blob-cache\") pod \"service-telemetry-operator-4-build\" (UID: \"cb882313-5084-4bd0-b5aa-25322ecd66ac\") " pod="service-telemetry/service-telemetry-operator-4-build"
Feb 24 00:19:31 crc kubenswrapper[4824]: I0224 00:19:31.676069 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/cb882313-5084-4bd0-b5aa-25322ecd66ac-node-pullsecrets\") pod \"service-telemetry-operator-4-build\" (UID: \"cb882313-5084-4bd0-b5aa-25322ecd66ac\") " pod="service-telemetry/service-telemetry-operator-4-build"
Feb 24 00:19:31 crc kubenswrapper[4824]: I0224 00:19:31.676293 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cb882313-5084-4bd0-b5aa-25322ecd66ac-build-proxy-ca-bundles\") pod \"service-telemetry-operator-4-build\" (UID: \"cb882313-5084-4bd0-b5aa-25322ecd66ac\") " pod="service-telemetry/service-telemetry-operator-4-build"
Feb 24 00:19:31 crc kubenswrapper[4824]: I0224 00:19:31.676698 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/cb882313-5084-4bd0-b5aa-25322ecd66ac-build-system-configs\") pod \"service-telemetry-operator-4-build\" (UID: \"cb882313-5084-4bd0-b5aa-25322ecd66ac\") " pod="service-telemetry/service-telemetry-operator-4-build"
Feb 24 00:19:31 crc kubenswrapper[4824]: I0224 00:19:31.677030 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/cb882313-5084-4bd0-b5aa-25322ecd66ac-container-storage-run\") pod \"service-telemetry-operator-4-build\" (UID: \"cb882313-5084-4bd0-b5aa-25322ecd66ac\") " pod="service-telemetry/service-telemetry-operator-4-build"
Feb 24 00:19:31 crc kubenswrapper[4824]: I0224 00:19:31.677578 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cb882313-5084-4bd0-b5aa-25322ecd66ac-build-ca-bundles\") pod \"service-telemetry-operator-4-build\" (UID: \"cb882313-5084-4bd0-b5aa-25322ecd66ac\") " pod="service-telemetry/service-telemetry-operator-4-build"
Feb 24 00:19:31 crc kubenswrapper[4824]: I0224 00:19:31.681762 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/cb882313-5084-4bd0-b5aa-25322ecd66ac-builder-dockercfg-gqxxs-pull\") pod \"service-telemetry-operator-4-build\" (UID: \"cb882313-5084-4bd0-b5aa-25322ecd66ac\") " pod="service-telemetry/service-telemetry-operator-4-build"
Feb 24 00:19:31 crc kubenswrapper[4824]: I0224 00:19:31.682653 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/cb882313-5084-4bd0-b5aa-25322ecd66ac-builder-dockercfg-gqxxs-push\") pod \"service-telemetry-operator-4-build\" (UID: \"cb882313-5084-4bd0-b5aa-25322ecd66ac\") " pod="service-telemetry/service-telemetry-operator-4-build"
Feb 24 00:19:31 crc kubenswrapper[4824]: I0224 00:19:31.695145 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7p5t\" (UniqueName: \"kubernetes.io/projected/cb882313-5084-4bd0-b5aa-25322ecd66ac-kube-api-access-p7p5t\") pod \"service-telemetry-operator-4-build\" (UID: \"cb882313-5084-4bd0-b5aa-25322ecd66ac\") " pod="service-telemetry/service-telemetry-operator-4-build"
Feb 24 00:19:31 crc kubenswrapper[4824]: I0224 00:19:31.860248 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-4-build"
Feb 24 00:19:32 crc kubenswrapper[4824]: I0224 00:19:32.090621 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-4-build"]
Feb 24 00:19:32 crc kubenswrapper[4824]: I0224 00:19:32.162205 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-4-build" event={"ID":"cb882313-5084-4bd0-b5aa-25322ecd66ac","Type":"ContainerStarted","Data":"805b4c2c99f251c12585542f59ae630605d117b692e0b9a0ce96358656635812"}
Feb 24 00:19:33 crc kubenswrapper[4824]: I0224 00:19:33.173906 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-4-build" event={"ID":"cb882313-5084-4bd0-b5aa-25322ecd66ac","Type":"ContainerStarted","Data":"1bc4d1fc862ebef78c7ee2786774d413f91471e8c4b13179ec6808d34cada1ed"}
Feb 24 00:19:33 crc kubenswrapper[4824]: E0224 00:19:33.249903 4824 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=4548264831213021485, SKID=, AKID=10:68:53:25:D1:3D:2E:E3:54:0D:95:0B:6A:90:F1:BD:A9:4F:27:6E failed: x509: certificate signed by unknown authority"
Feb 24 00:19:34 crc kubenswrapper[4824]: I0224 00:19:34.281693 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-4-build"]
Feb 24 00:19:35 crc kubenswrapper[4824]: I0224 00:19:35.186671 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/service-telemetry-operator-4-build" podUID="cb882313-5084-4bd0-b5aa-25322ecd66ac" containerName="git-clone" containerID="cri-o://1bc4d1fc862ebef78c7ee2786774d413f91471e8c4b13179ec6808d34cada1ed" gracePeriod=30
Feb 24 00:19:35 crc kubenswrapper[4824]: I0224 00:19:35.582379 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-4-build_cb882313-5084-4bd0-b5aa-25322ecd66ac/git-clone/0.log"
Feb 24 00:19:35 crc kubenswrapper[4824]: I0224 00:19:35.582828 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-4-build"
Feb 24 00:19:35 crc kubenswrapper[4824]: I0224 00:19:35.642311 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cb882313-5084-4bd0-b5aa-25322ecd66ac-build-proxy-ca-bundles\") pod \"cb882313-5084-4bd0-b5aa-25322ecd66ac\" (UID: \"cb882313-5084-4bd0-b5aa-25322ecd66ac\") "
Feb 24 00:19:35 crc kubenswrapper[4824]: I0224 00:19:35.642376 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/cb882313-5084-4bd0-b5aa-25322ecd66ac-builder-dockercfg-gqxxs-push\") pod \"cb882313-5084-4bd0-b5aa-25322ecd66ac\" (UID: \"cb882313-5084-4bd0-b5aa-25322ecd66ac\") "
Feb 24 00:19:35 crc kubenswrapper[4824]: I0224 00:19:35.642453 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cb882313-5084-4bd0-b5aa-25322ecd66ac-build-ca-bundles\") pod \"cb882313-5084-4bd0-b5aa-25322ecd66ac\" (UID: \"cb882313-5084-4bd0-b5aa-25322ecd66ac\") "
Feb 24 00:19:35 crc kubenswrapper[4824]: I0224 00:19:35.642622 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7p5t\" (UniqueName: \"kubernetes.io/projected/cb882313-5084-4bd0-b5aa-25322ecd66ac-kube-api-access-p7p5t\") pod \"cb882313-5084-4bd0-b5aa-25322ecd66ac\" (UID: \"cb882313-5084-4bd0-b5aa-25322ecd66ac\") "
Feb 24 00:19:35 crc kubenswrapper[4824]: I0224 00:19:35.642655 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/cb882313-5084-4bd0-b5aa-25322ecd66ac-node-pullsecrets\") pod \"cb882313-5084-4bd0-b5aa-25322ecd66ac\" (UID: \"cb882313-5084-4bd0-b5aa-25322ecd66ac\") "
Feb 24 00:19:35 crc kubenswrapper[4824]: I0224 00:19:35.642694 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/cb882313-5084-4bd0-b5aa-25322ecd66ac-buildworkdir\") pod \"cb882313-5084-4bd0-b5aa-25322ecd66ac\" (UID: \"cb882313-5084-4bd0-b5aa-25322ecd66ac\") "
Feb 24 00:19:35 crc kubenswrapper[4824]: I0224 00:19:35.642719 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/cb882313-5084-4bd0-b5aa-25322ecd66ac-build-blob-cache\") pod \"cb882313-5084-4bd0-b5aa-25322ecd66ac\" (UID: \"cb882313-5084-4bd0-b5aa-25322ecd66ac\") "
Feb 24 00:19:35 crc kubenswrapper[4824]: I0224 00:19:35.642751 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/cb882313-5084-4bd0-b5aa-25322ecd66ac-container-storage-root\") pod \"cb882313-5084-4bd0-b5aa-25322ecd66ac\" (UID: \"cb882313-5084-4bd0-b5aa-25322ecd66ac\") "
Feb 24 00:19:35 crc kubenswrapper[4824]: I0224 00:19:35.642796 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/cb882313-5084-4bd0-b5aa-25322ecd66ac-container-storage-run\") pod \"cb882313-5084-4bd0-b5aa-25322ecd66ac\" (UID: \"cb882313-5084-4bd0-b5aa-25322ecd66ac\") "
Feb 24 00:19:35 crc kubenswrapper[4824]: I0224 00:19:35.642819 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/cb882313-5084-4bd0-b5aa-25322ecd66ac-builder-dockercfg-gqxxs-pull\") pod \"cb882313-5084-4bd0-b5aa-25322ecd66ac\" (UID: \"cb882313-5084-4bd0-b5aa-25322ecd66ac\") "
Feb 24 00:19:35 crc kubenswrapper[4824]: I0224 00:19:35.642866 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/cb882313-5084-4bd0-b5aa-25322ecd66ac-build-system-configs\") pod \"cb882313-5084-4bd0-b5aa-25322ecd66ac\" (UID: \"cb882313-5084-4bd0-b5aa-25322ecd66ac\") "
Feb 24 00:19:35 crc kubenswrapper[4824]: I0224 00:19:35.642895 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/cb882313-5084-4bd0-b5aa-25322ecd66ac-buildcachedir\") pod \"cb882313-5084-4bd0-b5aa-25322ecd66ac\" (UID: \"cb882313-5084-4bd0-b5aa-25322ecd66ac\") "
Feb 24 00:19:35 crc kubenswrapper[4824]: I0224 00:19:35.643242 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cb882313-5084-4bd0-b5aa-25322ecd66ac-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "cb882313-5084-4bd0-b5aa-25322ecd66ac" (UID: "cb882313-5084-4bd0-b5aa-25322ecd66ac"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 24 00:19:35 crc kubenswrapper[4824]: I0224 00:19:35.643233 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb882313-5084-4bd0-b5aa-25322ecd66ac-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "cb882313-5084-4bd0-b5aa-25322ecd66ac" (UID: "cb882313-5084-4bd0-b5aa-25322ecd66ac"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 00:19:35 crc kubenswrapper[4824]: I0224 00:19:35.643668 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb882313-5084-4bd0-b5aa-25322ecd66ac-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "cb882313-5084-4bd0-b5aa-25322ecd66ac" (UID: "cb882313-5084-4bd0-b5aa-25322ecd66ac"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 00:19:35 crc kubenswrapper[4824]: I0224 00:19:35.643775 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cb882313-5084-4bd0-b5aa-25322ecd66ac-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "cb882313-5084-4bd0-b5aa-25322ecd66ac" (UID: "cb882313-5084-4bd0-b5aa-25322ecd66ac"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 24 00:19:35 crc kubenswrapper[4824]: I0224 00:19:35.644000 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb882313-5084-4bd0-b5aa-25322ecd66ac-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "cb882313-5084-4bd0-b5aa-25322ecd66ac" (UID: "cb882313-5084-4bd0-b5aa-25322ecd66ac"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 00:19:35 crc kubenswrapper[4824]: I0224 00:19:35.644355 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb882313-5084-4bd0-b5aa-25322ecd66ac-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "cb882313-5084-4bd0-b5aa-25322ecd66ac" (UID: "cb882313-5084-4bd0-b5aa-25322ecd66ac"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 00:19:35 crc kubenswrapper[4824]: I0224 00:19:35.644477 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb882313-5084-4bd0-b5aa-25322ecd66ac-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "cb882313-5084-4bd0-b5aa-25322ecd66ac" (UID: "cb882313-5084-4bd0-b5aa-25322ecd66ac"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 00:19:35 crc kubenswrapper[4824]: I0224 00:19:35.644662 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb882313-5084-4bd0-b5aa-25322ecd66ac-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "cb882313-5084-4bd0-b5aa-25322ecd66ac" (UID: "cb882313-5084-4bd0-b5aa-25322ecd66ac"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 00:19:35 crc kubenswrapper[4824]: I0224 00:19:35.644754 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb882313-5084-4bd0-b5aa-25322ecd66ac-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "cb882313-5084-4bd0-b5aa-25322ecd66ac" (UID: "cb882313-5084-4bd0-b5aa-25322ecd66ac"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 00:19:35 crc kubenswrapper[4824]: I0224 00:19:35.652731 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb882313-5084-4bd0-b5aa-25322ecd66ac-builder-dockercfg-gqxxs-push" (OuterVolumeSpecName: "builder-dockercfg-gqxxs-push") pod "cb882313-5084-4bd0-b5aa-25322ecd66ac" (UID: "cb882313-5084-4bd0-b5aa-25322ecd66ac"). InnerVolumeSpecName "builder-dockercfg-gqxxs-push". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 00:19:35 crc kubenswrapper[4824]: I0224 00:19:35.652774 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb882313-5084-4bd0-b5aa-25322ecd66ac-builder-dockercfg-gqxxs-pull" (OuterVolumeSpecName: "builder-dockercfg-gqxxs-pull") pod "cb882313-5084-4bd0-b5aa-25322ecd66ac" (UID: "cb882313-5084-4bd0-b5aa-25322ecd66ac"). InnerVolumeSpecName "builder-dockercfg-gqxxs-pull". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 00:19:35 crc kubenswrapper[4824]: I0224 00:19:35.652760 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb882313-5084-4bd0-b5aa-25322ecd66ac-kube-api-access-p7p5t" (OuterVolumeSpecName: "kube-api-access-p7p5t") pod "cb882313-5084-4bd0-b5aa-25322ecd66ac" (UID: "cb882313-5084-4bd0-b5aa-25322ecd66ac"). InnerVolumeSpecName "kube-api-access-p7p5t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 00:19:35 crc kubenswrapper[4824]: I0224 00:19:35.744907 4824 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cb882313-5084-4bd0-b5aa-25322ecd66ac-build-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 24 00:19:35 crc kubenswrapper[4824]: I0224 00:19:35.744953 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7p5t\" (UniqueName: \"kubernetes.io/projected/cb882313-5084-4bd0-b5aa-25322ecd66ac-kube-api-access-p7p5t\") on node \"crc\" DevicePath \"\""
Feb 24 00:19:35 crc kubenswrapper[4824]: I0224 00:19:35.744965 4824 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/cb882313-5084-4bd0-b5aa-25322ecd66ac-node-pullsecrets\") on node \"crc\" DevicePath \"\""
Feb 24 00:19:35 crc kubenswrapper[4824]: I0224 00:19:35.744974 4824 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/cb882313-5084-4bd0-b5aa-25322ecd66ac-buildworkdir\") on node \"crc\" DevicePath \"\""
Feb 24 00:19:35 crc kubenswrapper[4824]: I0224 00:19:35.744983 4824 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/cb882313-5084-4bd0-b5aa-25322ecd66ac-build-blob-cache\") on node \"crc\" DevicePath \"\""
Feb 24 00:19:35 crc kubenswrapper[4824]: I0224 00:19:35.744993 4824 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/cb882313-5084-4bd0-b5aa-25322ecd66ac-container-storage-root\") on node \"crc\" DevicePath \"\""
Feb 24 00:19:35 crc kubenswrapper[4824]: I0224 00:19:35.745158 4824 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/cb882313-5084-4bd0-b5aa-25322ecd66ac-container-storage-run\") on node \"crc\" DevicePath \"\""
Feb 24 00:19:35 crc kubenswrapper[4824]: I0224 00:19:35.745869 4824 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/cb882313-5084-4bd0-b5aa-25322ecd66ac-builder-dockercfg-gqxxs-pull\") on node \"crc\" DevicePath \"\""
Feb 24 00:19:35 crc kubenswrapper[4824]: I0224 00:19:35.745942 4824 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/cb882313-5084-4bd0-b5aa-25322ecd66ac-build-system-configs\") on node \"crc\" DevicePath \"\""
Feb 24 00:19:35 crc kubenswrapper[4824]: I0224 00:19:35.745970 4824 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/cb882313-5084-4bd0-b5aa-25322ecd66ac-buildcachedir\") on node \"crc\" DevicePath \"\""
Feb 24 00:19:35 crc kubenswrapper[4824]: I0224 00:19:35.746028 4824 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cb882313-5084-4bd0-b5aa-25322ecd66ac-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 24 00:19:35 crc kubenswrapper[4824]: I0224 00:19:35.746052 4824 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/cb882313-5084-4bd0-b5aa-25322ecd66ac-builder-dockercfg-gqxxs-push\") on node \"crc\" DevicePath \"\""
Feb 24 00:19:36 crc kubenswrapper[4824]: I0224 00:19:36.193870 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-4-build_cb882313-5084-4bd0-b5aa-25322ecd66ac/git-clone/0.log"
Feb 24 00:19:36 crc kubenswrapper[4824]: I0224 00:19:36.194208 4824 generic.go:334] "Generic (PLEG): container finished" podID="cb882313-5084-4bd0-b5aa-25322ecd66ac" containerID="1bc4d1fc862ebef78c7ee2786774d413f91471e8c4b13179ec6808d34cada1ed" exitCode=1
Feb 24 00:19:36 crc kubenswrapper[4824]: I0224 00:19:36.194337 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-4-build" event={"ID":"cb882313-5084-4bd0-b5aa-25322ecd66ac","Type":"ContainerDied","Data":"1bc4d1fc862ebef78c7ee2786774d413f91471e8c4b13179ec6808d34cada1ed"}
Feb 24 00:19:36 crc kubenswrapper[4824]: I0224 00:19:36.194409 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-4-build"
Feb 24 00:19:36 crc kubenswrapper[4824]: I0224 00:19:36.194454 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-4-build" event={"ID":"cb882313-5084-4bd0-b5aa-25322ecd66ac","Type":"ContainerDied","Data":"805b4c2c99f251c12585542f59ae630605d117b692e0b9a0ce96358656635812"}
Feb 24 00:19:36 crc kubenswrapper[4824]: I0224 00:19:36.194497 4824 scope.go:117] "RemoveContainer" containerID="1bc4d1fc862ebef78c7ee2786774d413f91471e8c4b13179ec6808d34cada1ed"
Feb 24 00:19:36 crc kubenswrapper[4824]: I0224 00:19:36.232169 4824 scope.go:117] "RemoveContainer" containerID="1bc4d1fc862ebef78c7ee2786774d413f91471e8c4b13179ec6808d34cada1ed"
Feb 24 00:19:36 crc kubenswrapper[4824]: E0224 00:19:36.233078 4824 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bc4d1fc862ebef78c7ee2786774d413f91471e8c4b13179ec6808d34cada1ed\": container with ID starting with 1bc4d1fc862ebef78c7ee2786774d413f91471e8c4b13179ec6808d34cada1ed not found: ID does not exist" containerID="1bc4d1fc862ebef78c7ee2786774d413f91471e8c4b13179ec6808d34cada1ed"
Feb 24 00:19:36 crc kubenswrapper[4824]: I0224 00:19:36.233117 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bc4d1fc862ebef78c7ee2786774d413f91471e8c4b13179ec6808d34cada1ed"} err="failed to get container status \"1bc4d1fc862ebef78c7ee2786774d413f91471e8c4b13179ec6808d34cada1ed\": rpc error: code = NotFound desc = could not find container \"1bc4d1fc862ebef78c7ee2786774d413f91471e8c4b13179ec6808d34cada1ed\": container with ID starting with 1bc4d1fc862ebef78c7ee2786774d413f91471e8c4b13179ec6808d34cada1ed not found: ID does not exist"
Feb 24 00:19:36 crc kubenswrapper[4824]: I0224 00:19:36.235113 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-4-build"]
Feb 24 00:19:36 crc kubenswrapper[4824]: I0224 00:19:36.242956 4824 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-operator-4-build"]
Feb 24 00:19:36 crc kubenswrapper[4824]: I0224 00:19:36.704892 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb882313-5084-4bd0-b5aa-25322ecd66ac" path="/var/lib/kubelet/pods/cb882313-5084-4bd0-b5aa-25322ecd66ac/volumes"
Feb 24 00:19:43 crc kubenswrapper[4824]: I0224 00:19:43.475947 4824 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 24 00:19:45 crc kubenswrapper[4824]: I0224 00:19:45.647132 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-5-build"]
Feb 24 00:19:45 crc kubenswrapper[4824]: E0224 00:19:45.647690 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb882313-5084-4bd0-b5aa-25322ecd66ac" containerName="git-clone"
Feb 24 00:19:45 crc kubenswrapper[4824]: I0224 00:19:45.647704 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb882313-5084-4bd0-b5aa-25322ecd66ac" containerName="git-clone"
Feb 24 00:19:45 crc kubenswrapper[4824]: I0224 00:19:45.647832 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb882313-5084-4bd0-b5aa-25322ecd66ac" containerName="git-clone"
Feb 24 00:19:45 crc kubenswrapper[4824]: I0224 00:19:45.648643 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-5-build"
Feb 24 00:19:45 crc kubenswrapper[4824]: I0224 00:19:45.653259 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-5-sys-config"
Feb 24 00:19:45 crc kubenswrapper[4824]: I0224 00:19:45.654135 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-5-global-ca"
Feb 24 00:19:45 crc kubenswrapper[4824]: I0224 00:19:45.654392 4824 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-gqxxs"
Feb 24 00:19:45 crc kubenswrapper[4824]: I0224 00:19:45.654804 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-5-ca"
Feb 24 00:19:45 crc kubenswrapper[4824]: I0224 00:19:45.673966 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-5-build"]
Feb 24 00:19:45 crc kubenswrapper[4824]: I0224 00:19:45.829476 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7e61a502-d012-4bfe-9788-440f95e757cf-container-storage-run\") pod \"service-telemetry-operator-5-build\" (UID: \"7e61a502-d012-4bfe-9788-440f95e757cf\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 24 00:19:45 crc kubenswrapper[4824]: I0224 00:19:45.829592 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7e61a502-d012-4bfe-9788-440f95e757cf-node-pullsecrets\") pod \"service-telemetry-operator-5-build\" (UID: \"7e61a502-d012-4bfe-9788-440f95e757cf\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 24 00:19:45 crc kubenswrapper[4824]: I0224 00:19:45.829700 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7e61a502-d012-4bfe-9788-440f95e757cf-buildcachedir\") pod \"service-telemetry-operator-5-build\" (UID: \"7e61a502-d012-4bfe-9788-440f95e757cf\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 24 00:19:45 crc kubenswrapper[4824]: I0224 00:19:45.829770 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/7e61a502-d012-4bfe-9788-440f95e757cf-build-system-configs\") pod \"service-telemetry-operator-5-build\" (UID: \"7e61a502-d012-4bfe-9788-440f95e757cf\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 24 00:19:45 crc kubenswrapper[4824]: I0224 00:19:45.829809 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7e61a502-d012-4bfe-9788-440f95e757cf-build-blob-cache\") pod \"service-telemetry-operator-5-build\" (UID: \"7e61a502-d012-4bfe-9788-440f95e757cf\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 24 00:19:45 crc kubenswrapper[4824]: I0224 00:19:45.829844 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/7e61a502-d012-4bfe-9788-440f95e757cf-builder-dockercfg-gqxxs-pull\") pod \"service-telemetry-operator-5-build\" (UID: \"7e61a502-d012-4bfe-9788-440f95e757cf\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 24 00:19:45 crc kubenswrapper[4824]: I0224 00:19:45.829881 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7e61a502-d012-4bfe-9788-440f95e757cf-build-ca-bundles\") pod \"service-telemetry-operator-5-build\" (UID: \"7e61a502-d012-4bfe-9788-440f95e757cf\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 24 00:19:45 crc kubenswrapper[4824]: I0224 00:19:45.829944 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/7e61a502-d012-4bfe-9788-440f95e757cf-builder-dockercfg-gqxxs-push\") pod \"service-telemetry-operator-5-build\" (UID: \"7e61a502-d012-4bfe-9788-440f95e757cf\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 24 00:19:45 crc kubenswrapper[4824]: I0224 00:19:45.829970 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkdvw\" (UniqueName: \"kubernetes.io/projected/7e61a502-d012-4bfe-9788-440f95e757cf-kube-api-access-fkdvw\") pod \"service-telemetry-operator-5-build\" (UID: \"7e61a502-d012-4bfe-9788-440f95e757cf\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 24 00:19:45 crc kubenswrapper[4824]: I0224 00:19:45.830026 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7e61a502-d012-4bfe-9788-440f95e757cf-buildworkdir\") pod \"service-telemetry-operator-5-build\" (UID: \"7e61a502-d012-4bfe-9788-440f95e757cf\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 24 00:19:45 crc kubenswrapper[4824]: I0224 00:19:45.830070 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7e61a502-d012-4bfe-9788-440f95e757cf-container-storage-root\") pod \"service-telemetry-operator-5-build\" (UID: \"7e61a502-d012-4bfe-9788-440f95e757cf\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 24 00:19:45 crc kubenswrapper[4824]: I0224 00:19:45.830088 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7e61a502-d012-4bfe-9788-440f95e757cf-build-proxy-ca-bundles\") pod \"service-telemetry-operator-5-build\" (UID: \"7e61a502-d012-4bfe-9788-440f95e757cf\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 24 00:19:45 crc kubenswrapper[4824]: I0224 00:19:45.931835 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7e61a502-d012-4bfe-9788-440f95e757cf-container-storage-root\") pod \"service-telemetry-operator-5-build\" (UID: \"7e61a502-d012-4bfe-9788-440f95e757cf\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 24 00:19:45 crc kubenswrapper[4824]: I0224 00:19:45.931920 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7e61a502-d012-4bfe-9788-440f95e757cf-build-proxy-ca-bundles\") pod \"service-telemetry-operator-5-build\" (UID: \"7e61a502-d012-4bfe-9788-440f95e757cf\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 24 00:19:45 crc kubenswrapper[4824]: I0224 00:19:45.931949 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7e61a502-d012-4bfe-9788-440f95e757cf-container-storage-run\") pod \"service-telemetry-operator-5-build\" (UID: \"7e61a502-d012-4bfe-9788-440f95e757cf\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 24 00:19:45 crc kubenswrapper[4824]: I0224 00:19:45.931981 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7e61a502-d012-4bfe-9788-440f95e757cf-node-pullsecrets\") pod \"service-telemetry-operator-5-build\" (UID: \"7e61a502-d012-4bfe-9788-440f95e757cf\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 24 00:19:45 crc kubenswrapper[4824]: I0224 00:19:45.932009 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7e61a502-d012-4bfe-9788-440f95e757cf-buildcachedir\") pod \"service-telemetry-operator-5-build\" (UID: \"7e61a502-d012-4bfe-9788-440f95e757cf\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 24 00:19:45 crc kubenswrapper[4824]: I0224 00:19:45.932042 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/7e61a502-d012-4bfe-9788-440f95e757cf-build-system-configs\") pod \"service-telemetry-operator-5-build\" (UID: \"7e61a502-d012-4bfe-9788-440f95e757cf\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 24 00:19:45 crc kubenswrapper[4824]: I0224 00:19:45.932067 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7e61a502-d012-4bfe-9788-440f95e757cf-build-blob-cache\") pod \"service-telemetry-operator-5-build\" (UID: \"7e61a502-d012-4bfe-9788-440f95e757cf\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 24 00:19:45 crc kubenswrapper[4824]: I0224 00:19:45.932083 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/7e61a502-d012-4bfe-9788-440f95e757cf-builder-dockercfg-gqxxs-pull\") pod \"service-telemetry-operator-5-build\" (UID: \"7e61a502-d012-4bfe-9788-440f95e757cf\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 24 00:19:45 crc kubenswrapper[4824]: I0224 00:19:45.932101 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7e61a502-d012-4bfe-9788-440f95e757cf-build-ca-bundles\") pod \"service-telemetry-operator-5-build\" (UID: \"7e61a502-d012-4bfe-9788-440f95e757cf\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 24 00:19:45 crc kubenswrapper[4824]: I0224 00:19:45.932128 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/7e61a502-d012-4bfe-9788-440f95e757cf-builder-dockercfg-gqxxs-push\") pod \"service-telemetry-operator-5-build\" (UID: \"7e61a502-d012-4bfe-9788-440f95e757cf\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 24 00:19:45 crc kubenswrapper[4824]: I0224 00:19:45.932150 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkdvw\" (UniqueName: \"kubernetes.io/projected/7e61a502-d012-4bfe-9788-440f95e757cf-kube-api-access-fkdvw\") pod \"service-telemetry-operator-5-build\" (UID: \"7e61a502-d012-4bfe-9788-440f95e757cf\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 24 00:19:45 crc kubenswrapper[4824]: I0224 00:19:45.932166 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7e61a502-d012-4bfe-9788-440f95e757cf-node-pullsecrets\") pod \"service-telemetry-operator-5-build\" (UID: \"7e61a502-d012-4bfe-9788-440f95e757cf\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 24 00:19:45 crc kubenswrapper[4824]: I0224 00:19:45.932182 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7e61a502-d012-4bfe-9788-440f95e757cf-buildworkdir\") pod \"service-telemetry-operator-5-build\" (UID: \"7e61a502-d012-4bfe-9788-440f95e757cf\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 24 00:19:45 crc kubenswrapper[4824]: I0224 00:19:45.932589 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7e61a502-d012-4bfe-9788-440f95e757cf-container-storage-run\") pod \"service-telemetry-operator-5-build\" (UID: \"7e61a502-d012-4bfe-9788-440f95e757cf\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 24 00:19:45 crc kubenswrapper[4824]: I0224 00:19:45.932612 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7e61a502-d012-4bfe-9788-440f95e757cf-container-storage-root\") pod \"service-telemetry-operator-5-build\" (UID: \"7e61a502-d012-4bfe-9788-440f95e757cf\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 24 00:19:45 crc kubenswrapper[4824]: I0224 00:19:45.932807 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7e61a502-d012-4bfe-9788-440f95e757cf-buildcachedir\") pod \"service-telemetry-operator-5-build\" (UID: \"7e61a502-d012-4bfe-9788-440f95e757cf\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 24 00:19:45 crc kubenswrapper[4824]: I0224 00:19:45.932975 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7e61a502-d012-4bfe-9788-440f95e757cf-buildworkdir\") pod \"service-telemetry-operator-5-build\" (UID: \"7e61a502-d012-4bfe-9788-440f95e757cf\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 24 00:19:45 crc kubenswrapper[4824]: I0224 00:19:45.933105 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/7e61a502-d012-4bfe-9788-440f95e757cf-build-system-configs\") pod \"service-telemetry-operator-5-build\" (UID: \"7e61a502-d012-4bfe-9788-440f95e757cf\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 24 00:19:45 crc kubenswrapper[4824]: I0224 00:19:45.933443 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7e61a502-d012-4bfe-9788-440f95e757cf-build-proxy-ca-bundles\") pod \"service-telemetry-operator-5-build\" (UID: \"7e61a502-d012-4bfe-9788-440f95e757cf\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 24 00:19:45 crc kubenswrapper[4824]: I0224 00:19:45.933728 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7e61a502-d012-4bfe-9788-440f95e757cf-build-blob-cache\") pod \"service-telemetry-operator-5-build\" (UID: \"7e61a502-d012-4bfe-9788-440f95e757cf\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 24 00:19:45 crc kubenswrapper[4824]: I0224 00:19:45.934276 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7e61a502-d012-4bfe-9788-440f95e757cf-build-ca-bundles\") pod \"service-telemetry-operator-5-build\" (UID: \"7e61a502-d012-4bfe-9788-440f95e757cf\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 24 00:19:45 crc kubenswrapper[4824]: I0224 00:19:45.945466 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/7e61a502-d012-4bfe-9788-440f95e757cf-builder-dockercfg-gqxxs-pull\") pod \"service-telemetry-operator-5-build\" (UID: \"7e61a502-d012-4bfe-9788-440f95e757cf\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 24 00:19:45 crc kubenswrapper[4824]: I0224 00:19:45.946057 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/7e61a502-d012-4bfe-9788-440f95e757cf-builder-dockercfg-gqxxs-push\") pod \"service-telemetry-operator-5-build\" (UID: \"7e61a502-d012-4bfe-9788-440f95e757cf\") " pod="service-telemetry/service-telemetry-operator-5-build"
Feb 24 00:19:45 crc kubenswrapper[4824]: I0224
00:19:45.954606 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkdvw\" (UniqueName: \"kubernetes.io/projected/7e61a502-d012-4bfe-9788-440f95e757cf-kube-api-access-fkdvw\") pod \"service-telemetry-operator-5-build\" (UID: \"7e61a502-d012-4bfe-9788-440f95e757cf\") " pod="service-telemetry/service-telemetry-operator-5-build" Feb 24 00:19:45 crc kubenswrapper[4824]: I0224 00:19:45.966567 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-5-build" Feb 24 00:19:46 crc kubenswrapper[4824]: I0224 00:19:46.192861 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-5-build"] Feb 24 00:19:46 crc kubenswrapper[4824]: I0224 00:19:46.270073 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-5-build" event={"ID":"7e61a502-d012-4bfe-9788-440f95e757cf","Type":"ContainerStarted","Data":"44273c89e677ea30f0737c8184f478f3798ea6ee49c1c3b6c1308da56ccd9ed5"} Feb 24 00:19:47 crc kubenswrapper[4824]: I0224 00:19:47.277950 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-5-build" event={"ID":"7e61a502-d012-4bfe-9788-440f95e757cf","Type":"ContainerStarted","Data":"c64d5595559c578057b819d0d0c20e76d9b8f63ede84abcd57bbd5cb763e2039"} Feb 24 00:19:53 crc kubenswrapper[4824]: I0224 00:19:53.275982 4824 patch_prober.go:28] interesting pod/machine-config-daemon-vcbgn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 00:19:53 crc kubenswrapper[4824]: I0224 00:19:53.277204 4824 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 00:19:53 crc kubenswrapper[4824]: I0224 00:19:53.277278 4824 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" Feb 24 00:19:53 crc kubenswrapper[4824]: I0224 00:19:53.278559 4824 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"43fc5998f7ab77a1ca73519cb6a4280f5869d3a50153e1dc6202d26bc4d9b6a3"} pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 24 00:19:53 crc kubenswrapper[4824]: I0224 00:19:53.278625 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" containerName="machine-config-daemon" containerID="cri-o://43fc5998f7ab77a1ca73519cb6a4280f5869d3a50153e1dc6202d26bc4d9b6a3" gracePeriod=600 Feb 24 00:19:54 crc kubenswrapper[4824]: I0224 00:19:54.340436 4824 generic.go:334] "Generic (PLEG): container finished" podID="7e61a502-d012-4bfe-9788-440f95e757cf" containerID="c64d5595559c578057b819d0d0c20e76d9b8f63ede84abcd57bbd5cb763e2039" exitCode=0 Feb 24 00:19:54 crc kubenswrapper[4824]: I0224 00:19:54.340603 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-5-build" event={"ID":"7e61a502-d012-4bfe-9788-440f95e757cf","Type":"ContainerDied","Data":"c64d5595559c578057b819d0d0c20e76d9b8f63ede84abcd57bbd5cb763e2039"} Feb 24 00:19:54 crc kubenswrapper[4824]: I0224 00:19:54.347490 4824 generic.go:334] "Generic (PLEG): container finished" podID="939ca085-9383-42e6-b7d6-37f101137273" 
containerID="43fc5998f7ab77a1ca73519cb6a4280f5869d3a50153e1dc6202d26bc4d9b6a3" exitCode=0 Feb 24 00:19:54 crc kubenswrapper[4824]: I0224 00:19:54.347934 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" event={"ID":"939ca085-9383-42e6-b7d6-37f101137273","Type":"ContainerDied","Data":"43fc5998f7ab77a1ca73519cb6a4280f5869d3a50153e1dc6202d26bc4d9b6a3"} Feb 24 00:19:54 crc kubenswrapper[4824]: I0224 00:19:54.347970 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" event={"ID":"939ca085-9383-42e6-b7d6-37f101137273","Type":"ContainerStarted","Data":"960af4768f01705e18d424766fe08bb2ebb2088d821d2cb697f31ab6e24ccd28"} Feb 24 00:19:54 crc kubenswrapper[4824]: I0224 00:19:54.347994 4824 scope.go:117] "RemoveContainer" containerID="14f28b64a526a9334cfaacd13a3a23756d3ea46670a60bcfe695a7e80551056e" Feb 24 00:19:55 crc kubenswrapper[4824]: I0224 00:19:55.358405 4824 generic.go:334] "Generic (PLEG): container finished" podID="7e61a502-d012-4bfe-9788-440f95e757cf" containerID="ba082b7917d1ad0a20e76bc29b76ce7ae092dfc546fb0ff65abd6f53c2cf70f9" exitCode=0 Feb 24 00:19:55 crc kubenswrapper[4824]: I0224 00:19:55.358681 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-5-build" event={"ID":"7e61a502-d012-4bfe-9788-440f95e757cf","Type":"ContainerDied","Data":"ba082b7917d1ad0a20e76bc29b76ce7ae092dfc546fb0ff65abd6f53c2cf70f9"} Feb 24 00:19:55 crc kubenswrapper[4824]: I0224 00:19:55.404597 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-5-build_7e61a502-d012-4bfe-9788-440f95e757cf/manage-dockerfile/0.log" Feb 24 00:19:56 crc kubenswrapper[4824]: I0224 00:19:56.380304 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-5-build" 
event={"ID":"7e61a502-d012-4bfe-9788-440f95e757cf","Type":"ContainerStarted","Data":"17c294022e78aa9014d3c492430d5e04773952f0a4b192822b172553c6c46c17"} Feb 24 00:19:56 crc kubenswrapper[4824]: I0224 00:19:56.417093 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-5-build" podStartSLOduration=11.417058839 podStartE2EDuration="11.417058839s" podCreationTimestamp="2026-02-24 00:19:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:19:56.406247568 +0000 UTC m=+860.395872057" watchObservedRunningTime="2026-02-24 00:19:56.417058839 +0000 UTC m=+860.406683308" Feb 24 00:21:25 crc kubenswrapper[4824]: I0224 00:21:25.991552 4824 generic.go:334] "Generic (PLEG): container finished" podID="7e61a502-d012-4bfe-9788-440f95e757cf" containerID="17c294022e78aa9014d3c492430d5e04773952f0a4b192822b172553c6c46c17" exitCode=0 Feb 24 00:21:25 crc kubenswrapper[4824]: I0224 00:21:25.991616 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-5-build" event={"ID":"7e61a502-d012-4bfe-9788-440f95e757cf","Type":"ContainerDied","Data":"17c294022e78aa9014d3c492430d5e04773952f0a4b192822b172553c6c46c17"} Feb 24 00:21:27 crc kubenswrapper[4824]: I0224 00:21:27.251829 4824 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-5-build" Feb 24 00:21:27 crc kubenswrapper[4824]: I0224 00:21:27.325024 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkdvw\" (UniqueName: \"kubernetes.io/projected/7e61a502-d012-4bfe-9788-440f95e757cf-kube-api-access-fkdvw\") pod \"7e61a502-d012-4bfe-9788-440f95e757cf\" (UID: \"7e61a502-d012-4bfe-9788-440f95e757cf\") " Feb 24 00:21:27 crc kubenswrapper[4824]: I0224 00:21:27.325124 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7e61a502-d012-4bfe-9788-440f95e757cf-node-pullsecrets\") pod \"7e61a502-d012-4bfe-9788-440f95e757cf\" (UID: \"7e61a502-d012-4bfe-9788-440f95e757cf\") " Feb 24 00:21:27 crc kubenswrapper[4824]: I0224 00:21:27.325177 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7e61a502-d012-4bfe-9788-440f95e757cf-container-storage-root\") pod \"7e61a502-d012-4bfe-9788-440f95e757cf\" (UID: \"7e61a502-d012-4bfe-9788-440f95e757cf\") " Feb 24 00:21:27 crc kubenswrapper[4824]: I0224 00:21:27.325211 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7e61a502-d012-4bfe-9788-440f95e757cf-buildcachedir\") pod \"7e61a502-d012-4bfe-9788-440f95e757cf\" (UID: \"7e61a502-d012-4bfe-9788-440f95e757cf\") " Feb 24 00:21:27 crc kubenswrapper[4824]: I0224 00:21:27.325244 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/7e61a502-d012-4bfe-9788-440f95e757cf-builder-dockercfg-gqxxs-pull\") pod \"7e61a502-d012-4bfe-9788-440f95e757cf\" (UID: \"7e61a502-d012-4bfe-9788-440f95e757cf\") " Feb 24 00:21:27 crc kubenswrapper[4824]: I0224 00:21:27.325278 4824 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/7e61a502-d012-4bfe-9788-440f95e757cf-build-system-configs\") pod \"7e61a502-d012-4bfe-9788-440f95e757cf\" (UID: \"7e61a502-d012-4bfe-9788-440f95e757cf\") " Feb 24 00:21:27 crc kubenswrapper[4824]: I0224 00:21:27.325314 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7e61a502-d012-4bfe-9788-440f95e757cf-build-ca-bundles\") pod \"7e61a502-d012-4bfe-9788-440f95e757cf\" (UID: \"7e61a502-d012-4bfe-9788-440f95e757cf\") " Feb 24 00:21:27 crc kubenswrapper[4824]: I0224 00:21:27.325342 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7e61a502-d012-4bfe-9788-440f95e757cf-build-blob-cache\") pod \"7e61a502-d012-4bfe-9788-440f95e757cf\" (UID: \"7e61a502-d012-4bfe-9788-440f95e757cf\") " Feb 24 00:21:27 crc kubenswrapper[4824]: I0224 00:21:27.325365 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7e61a502-d012-4bfe-9788-440f95e757cf-container-storage-run\") pod \"7e61a502-d012-4bfe-9788-440f95e757cf\" (UID: \"7e61a502-d012-4bfe-9788-440f95e757cf\") " Feb 24 00:21:27 crc kubenswrapper[4824]: I0224 00:21:27.325398 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7e61a502-d012-4bfe-9788-440f95e757cf-build-proxy-ca-bundles\") pod \"7e61a502-d012-4bfe-9788-440f95e757cf\" (UID: \"7e61a502-d012-4bfe-9788-440f95e757cf\") " Feb 24 00:21:27 crc kubenswrapper[4824]: I0224 00:21:27.325455 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: 
\"kubernetes.io/secret/7e61a502-d012-4bfe-9788-440f95e757cf-builder-dockercfg-gqxxs-push\") pod \"7e61a502-d012-4bfe-9788-440f95e757cf\" (UID: \"7e61a502-d012-4bfe-9788-440f95e757cf\") " Feb 24 00:21:27 crc kubenswrapper[4824]: I0224 00:21:27.325482 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7e61a502-d012-4bfe-9788-440f95e757cf-buildworkdir\") pod \"7e61a502-d012-4bfe-9788-440f95e757cf\" (UID: \"7e61a502-d012-4bfe-9788-440f95e757cf\") " Feb 24 00:21:27 crc kubenswrapper[4824]: I0224 00:21:27.326574 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7e61a502-d012-4bfe-9788-440f95e757cf-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "7e61a502-d012-4bfe-9788-440f95e757cf" (UID: "7e61a502-d012-4bfe-9788-440f95e757cf"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 00:21:27 crc kubenswrapper[4824]: I0224 00:21:27.326624 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e61a502-d012-4bfe-9788-440f95e757cf-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "7e61a502-d012-4bfe-9788-440f95e757cf" (UID: "7e61a502-d012-4bfe-9788-440f95e757cf"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:21:27 crc kubenswrapper[4824]: I0224 00:21:27.326661 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7e61a502-d012-4bfe-9788-440f95e757cf-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "7e61a502-d012-4bfe-9788-440f95e757cf" (UID: "7e61a502-d012-4bfe-9788-440f95e757cf"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 00:21:27 crc kubenswrapper[4824]: I0224 00:21:27.327833 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e61a502-d012-4bfe-9788-440f95e757cf-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "7e61a502-d012-4bfe-9788-440f95e757cf" (UID: "7e61a502-d012-4bfe-9788-440f95e757cf"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:21:27 crc kubenswrapper[4824]: I0224 00:21:27.327871 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e61a502-d012-4bfe-9788-440f95e757cf-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "7e61a502-d012-4bfe-9788-440f95e757cf" (UID: "7e61a502-d012-4bfe-9788-440f95e757cf"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:21:27 crc kubenswrapper[4824]: I0224 00:21:27.327859 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e61a502-d012-4bfe-9788-440f95e757cf-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "7e61a502-d012-4bfe-9788-440f95e757cf" (UID: "7e61a502-d012-4bfe-9788-440f95e757cf"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:21:27 crc kubenswrapper[4824]: I0224 00:21:27.336852 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e61a502-d012-4bfe-9788-440f95e757cf-builder-dockercfg-gqxxs-pull" (OuterVolumeSpecName: "builder-dockercfg-gqxxs-pull") pod "7e61a502-d012-4bfe-9788-440f95e757cf" (UID: "7e61a502-d012-4bfe-9788-440f95e757cf"). InnerVolumeSpecName "builder-dockercfg-gqxxs-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:21:27 crc kubenswrapper[4824]: I0224 00:21:27.336969 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e61a502-d012-4bfe-9788-440f95e757cf-kube-api-access-fkdvw" (OuterVolumeSpecName: "kube-api-access-fkdvw") pod "7e61a502-d012-4bfe-9788-440f95e757cf" (UID: "7e61a502-d012-4bfe-9788-440f95e757cf"). InnerVolumeSpecName "kube-api-access-fkdvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:21:27 crc kubenswrapper[4824]: I0224 00:21:27.337107 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e61a502-d012-4bfe-9788-440f95e757cf-builder-dockercfg-gqxxs-push" (OuterVolumeSpecName: "builder-dockercfg-gqxxs-push") pod "7e61a502-d012-4bfe-9788-440f95e757cf" (UID: "7e61a502-d012-4bfe-9788-440f95e757cf"). InnerVolumeSpecName "builder-dockercfg-gqxxs-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:21:27 crc kubenswrapper[4824]: I0224 00:21:27.366462 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e61a502-d012-4bfe-9788-440f95e757cf-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "7e61a502-d012-4bfe-9788-440f95e757cf" (UID: "7e61a502-d012-4bfe-9788-440f95e757cf"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:21:27 crc kubenswrapper[4824]: I0224 00:21:27.427364 4824 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7e61a502-d012-4bfe-9788-440f95e757cf-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 24 00:21:27 crc kubenswrapper[4824]: I0224 00:21:27.427408 4824 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7e61a502-d012-4bfe-9788-440f95e757cf-container-storage-run\") on node \"crc\" DevicePath \"\"" Feb 24 00:21:27 crc kubenswrapper[4824]: I0224 00:21:27.427425 4824 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7e61a502-d012-4bfe-9788-440f95e757cf-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 24 00:21:27 crc kubenswrapper[4824]: I0224 00:21:27.427437 4824 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/7e61a502-d012-4bfe-9788-440f95e757cf-builder-dockercfg-gqxxs-push\") on node \"crc\" DevicePath \"\"" Feb 24 00:21:27 crc kubenswrapper[4824]: I0224 00:21:27.427450 4824 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7e61a502-d012-4bfe-9788-440f95e757cf-buildworkdir\") on node \"crc\" DevicePath \"\"" Feb 24 00:21:27 crc kubenswrapper[4824]: I0224 00:21:27.427462 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkdvw\" (UniqueName: \"kubernetes.io/projected/7e61a502-d012-4bfe-9788-440f95e757cf-kube-api-access-fkdvw\") on node \"crc\" DevicePath \"\"" Feb 24 00:21:27 crc kubenswrapper[4824]: I0224 00:21:27.427472 4824 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7e61a502-d012-4bfe-9788-440f95e757cf-node-pullsecrets\") on node \"crc\" DevicePath 
\"\"" Feb 24 00:21:27 crc kubenswrapper[4824]: I0224 00:21:27.427485 4824 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7e61a502-d012-4bfe-9788-440f95e757cf-buildcachedir\") on node \"crc\" DevicePath \"\"" Feb 24 00:21:27 crc kubenswrapper[4824]: I0224 00:21:27.427496 4824 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/7e61a502-d012-4bfe-9788-440f95e757cf-builder-dockercfg-gqxxs-pull\") on node \"crc\" DevicePath \"\"" Feb 24 00:21:27 crc kubenswrapper[4824]: I0224 00:21:27.427508 4824 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/7e61a502-d012-4bfe-9788-440f95e757cf-build-system-configs\") on node \"crc\" DevicePath \"\"" Feb 24 00:21:27 crc kubenswrapper[4824]: I0224 00:21:27.516274 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e61a502-d012-4bfe-9788-440f95e757cf-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "7e61a502-d012-4bfe-9788-440f95e757cf" (UID: "7e61a502-d012-4bfe-9788-440f95e757cf"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:21:27 crc kubenswrapper[4824]: I0224 00:21:27.529110 4824 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7e61a502-d012-4bfe-9788-440f95e757cf-build-blob-cache\") on node \"crc\" DevicePath \"\"" Feb 24 00:21:28 crc kubenswrapper[4824]: I0224 00:21:28.008174 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-5-build" event={"ID":"7e61a502-d012-4bfe-9788-440f95e757cf","Type":"ContainerDied","Data":"44273c89e677ea30f0737c8184f478f3798ea6ee49c1c3b6c1308da56ccd9ed5"} Feb 24 00:21:28 crc kubenswrapper[4824]: I0224 00:21:28.008226 4824 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44273c89e677ea30f0737c8184f478f3798ea6ee49c1c3b6c1308da56ccd9ed5" Feb 24 00:21:28 crc kubenswrapper[4824]: I0224 00:21:28.008316 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-5-build" Feb 24 00:21:29 crc kubenswrapper[4824]: I0224 00:21:29.188741 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e61a502-d012-4bfe-9788-440f95e757cf-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "7e61a502-d012-4bfe-9788-440f95e757cf" (UID: "7e61a502-d012-4bfe-9788-440f95e757cf"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:21:29 crc kubenswrapper[4824]: I0224 00:21:29.255443 4824 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7e61a502-d012-4bfe-9788-440f95e757cf-container-storage-root\") on node \"crc\" DevicePath \"\"" Feb 24 00:21:31 crc kubenswrapper[4824]: I0224 00:21:31.291208 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Feb 24 00:21:31 crc kubenswrapper[4824]: E0224 00:21:31.291541 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e61a502-d012-4bfe-9788-440f95e757cf" containerName="manage-dockerfile" Feb 24 00:21:31 crc kubenswrapper[4824]: I0224 00:21:31.291553 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e61a502-d012-4bfe-9788-440f95e757cf" containerName="manage-dockerfile" Feb 24 00:21:31 crc kubenswrapper[4824]: E0224 00:21:31.291564 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e61a502-d012-4bfe-9788-440f95e757cf" containerName="git-clone" Feb 24 00:21:31 crc kubenswrapper[4824]: I0224 00:21:31.291571 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e61a502-d012-4bfe-9788-440f95e757cf" containerName="git-clone" Feb 24 00:21:31 crc kubenswrapper[4824]: E0224 00:21:31.291581 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e61a502-d012-4bfe-9788-440f95e757cf" containerName="docker-build" Feb 24 00:21:31 crc kubenswrapper[4824]: I0224 00:21:31.291588 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e61a502-d012-4bfe-9788-440f95e757cf" containerName="docker-build" Feb 24 00:21:31 crc kubenswrapper[4824]: I0224 00:21:31.291743 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e61a502-d012-4bfe-9788-440f95e757cf" containerName="docker-build" Feb 24 00:21:31 crc kubenswrapper[4824]: I0224 00:21:31.292775 4824 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Feb 24 00:21:31 crc kubenswrapper[4824]: I0224 00:21:31.296546 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-1-global-ca" Feb 24 00:21:31 crc kubenswrapper[4824]: I0224 00:21:31.297919 4824 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-gqxxs" Feb 24 00:21:31 crc kubenswrapper[4824]: I0224 00:21:31.299349 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-1-sys-config" Feb 24 00:21:31 crc kubenswrapper[4824]: I0224 00:21:31.301388 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-1-ca" Feb 24 00:21:31 crc kubenswrapper[4824]: I0224 00:21:31.313871 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Feb 24 00:21:31 crc kubenswrapper[4824]: I0224 00:21:31.389598 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/7c94136e-3210-48f4-bd2c-cbfb25d117b6-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\") " pod="service-telemetry/smart-gateway-operator-1-build" Feb 24 00:21:31 crc kubenswrapper[4824]: I0224 00:21:31.389668 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7c94136e-3210-48f4-bd2c-cbfb25d117b6-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\") " pod="service-telemetry/smart-gateway-operator-1-build" Feb 24 00:21:31 crc kubenswrapper[4824]: I0224 00:21:31.389703 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7c94136e-3210-48f4-bd2c-cbfb25d117b6-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 24 00:21:31 crc kubenswrapper[4824]: I0224 00:21:31.389785 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/7c94136e-3210-48f4-bd2c-cbfb25d117b6-builder-dockercfg-gqxxs-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 24 00:21:31 crc kubenswrapper[4824]: I0224 00:21:31.389821 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7znl\" (UniqueName: \"kubernetes.io/projected/7c94136e-3210-48f4-bd2c-cbfb25d117b6-kube-api-access-n7znl\") pod \"smart-gateway-operator-1-build\" (UID: \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 24 00:21:31 crc kubenswrapper[4824]: I0224 00:21:31.389847 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7c94136e-3210-48f4-bd2c-cbfb25d117b6-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 24 00:21:31 crc kubenswrapper[4824]: I0224 00:21:31.389869 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7c94136e-3210-48f4-bd2c-cbfb25d117b6-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 24 00:21:31 crc kubenswrapper[4824]: I0224 00:21:31.389890 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7c94136e-3210-48f4-bd2c-cbfb25d117b6-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 24 00:21:31 crc kubenswrapper[4824]: I0224 00:21:31.390024 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7c94136e-3210-48f4-bd2c-cbfb25d117b6-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 24 00:21:31 crc kubenswrapper[4824]: I0224 00:21:31.390128 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/7c94136e-3210-48f4-bd2c-cbfb25d117b6-builder-dockercfg-gqxxs-push\") pod \"smart-gateway-operator-1-build\" (UID: \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 24 00:21:31 crc kubenswrapper[4824]: I0224 00:21:31.390279 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7c94136e-3210-48f4-bd2c-cbfb25d117b6-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 24 00:21:31 crc kubenswrapper[4824]: I0224 00:21:31.390318 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7c94136e-3210-48f4-bd2c-cbfb25d117b6-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 24 00:21:31 crc kubenswrapper[4824]: I0224 00:21:31.492506 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/7c94136e-3210-48f4-bd2c-cbfb25d117b6-builder-dockercfg-gqxxs-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 24 00:21:31 crc kubenswrapper[4824]: I0224 00:21:31.492596 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7znl\" (UniqueName: \"kubernetes.io/projected/7c94136e-3210-48f4-bd2c-cbfb25d117b6-kube-api-access-n7znl\") pod \"smart-gateway-operator-1-build\" (UID: \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 24 00:21:31 crc kubenswrapper[4824]: I0224 00:21:31.492638 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7c94136e-3210-48f4-bd2c-cbfb25d117b6-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 24 00:21:31 crc kubenswrapper[4824]: I0224 00:21:31.492669 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7c94136e-3210-48f4-bd2c-cbfb25d117b6-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 24 00:21:31 crc kubenswrapper[4824]: I0224 00:21:31.492693 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7c94136e-3210-48f4-bd2c-cbfb25d117b6-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 24 00:21:31 crc kubenswrapper[4824]: I0224 00:21:31.492726 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7c94136e-3210-48f4-bd2c-cbfb25d117b6-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 24 00:21:31 crc kubenswrapper[4824]: I0224 00:21:31.492764 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/7c94136e-3210-48f4-bd2c-cbfb25d117b6-builder-dockercfg-gqxxs-push\") pod \"smart-gateway-operator-1-build\" (UID: \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 24 00:21:31 crc kubenswrapper[4824]: I0224 00:21:31.492820 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7c94136e-3210-48f4-bd2c-cbfb25d117b6-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 24 00:21:31 crc kubenswrapper[4824]: I0224 00:21:31.492855 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7c94136e-3210-48f4-bd2c-cbfb25d117b6-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 24 00:21:31 crc kubenswrapper[4824]: I0224 00:21:31.492915 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/7c94136e-3210-48f4-bd2c-cbfb25d117b6-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 24 00:21:31 crc kubenswrapper[4824]: I0224 00:21:31.492944 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7c94136e-3210-48f4-bd2c-cbfb25d117b6-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 24 00:21:31 crc kubenswrapper[4824]: I0224 00:21:31.492973 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7c94136e-3210-48f4-bd2c-cbfb25d117b6-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 24 00:21:31 crc kubenswrapper[4824]: I0224 00:21:31.493491 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7c94136e-3210-48f4-bd2c-cbfb25d117b6-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 24 00:21:31 crc kubenswrapper[4824]: I0224 00:21:31.493549 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7c94136e-3210-48f4-bd2c-cbfb25d117b6-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 24 00:21:31 crc kubenswrapper[4824]: I0224 00:21:31.493747 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7c94136e-3210-48f4-bd2c-cbfb25d117b6-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 24 00:21:31 crc kubenswrapper[4824]: I0224 00:21:31.494031 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7c94136e-3210-48f4-bd2c-cbfb25d117b6-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 24 00:21:31 crc kubenswrapper[4824]: I0224 00:21:31.494196 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7c94136e-3210-48f4-bd2c-cbfb25d117b6-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 24 00:21:31 crc kubenswrapper[4824]: I0224 00:21:31.494640 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7c94136e-3210-48f4-bd2c-cbfb25d117b6-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 24 00:21:31 crc kubenswrapper[4824]: I0224 00:21:31.494693 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7c94136e-3210-48f4-bd2c-cbfb25d117b6-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 24 00:21:31 crc kubenswrapper[4824]: I0224 00:21:31.494930 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7c94136e-3210-48f4-bd2c-cbfb25d117b6-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 24 00:21:31 crc kubenswrapper[4824]: I0224 00:21:31.494992 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/7c94136e-3210-48f4-bd2c-cbfb25d117b6-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 24 00:21:31 crc kubenswrapper[4824]: I0224 00:21:31.502302 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/7c94136e-3210-48f4-bd2c-cbfb25d117b6-builder-dockercfg-gqxxs-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 24 00:21:31 crc kubenswrapper[4824]: I0224 00:21:31.502320 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/7c94136e-3210-48f4-bd2c-cbfb25d117b6-builder-dockercfg-gqxxs-push\") pod \"smart-gateway-operator-1-build\" (UID: \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 24 00:21:31 crc kubenswrapper[4824]: I0224 00:21:31.514437 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7znl\" (UniqueName: \"kubernetes.io/projected/7c94136e-3210-48f4-bd2c-cbfb25d117b6-kube-api-access-n7znl\") pod \"smart-gateway-operator-1-build\" (UID: \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\") " pod="service-telemetry/smart-gateway-operator-1-build"
Feb 24 00:21:31 crc kubenswrapper[4824]: I0224 00:21:31.615095 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build"
Feb 24 00:21:31 crc kubenswrapper[4824]: I0224 00:21:31.868215 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"]
Feb 24 00:21:32 crc kubenswrapper[4824]: I0224 00:21:32.044001 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"7c94136e-3210-48f4-bd2c-cbfb25d117b6","Type":"ContainerStarted","Data":"38ae562907291985af4b9d5f5cf7656a88ee4ee3a1caf482cc7dab2a3e8b6f2c"}
Feb 24 00:21:33 crc kubenswrapper[4824]: I0224 00:21:33.052317 4824 generic.go:334] "Generic (PLEG): container finished" podID="7c94136e-3210-48f4-bd2c-cbfb25d117b6" containerID="a826809630bf7ea89dbd405b13377d29a8fc6004a50e897939509359da21749a" exitCode=0
Feb 24 00:21:33 crc kubenswrapper[4824]: I0224 00:21:33.052790 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"7c94136e-3210-48f4-bd2c-cbfb25d117b6","Type":"ContainerDied","Data":"a826809630bf7ea89dbd405b13377d29a8fc6004a50e897939509359da21749a"}
Feb 24 00:21:34 crc kubenswrapper[4824]: I0224 00:21:34.062982 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"7c94136e-3210-48f4-bd2c-cbfb25d117b6","Type":"ContainerStarted","Data":"39243a21b42b77e3a813ff45c21060c950e9e46e56e45b01ee3b7c5543bc3f0d"}
Feb 24 00:21:41 crc kubenswrapper[4824]: I0224 00:21:41.723493 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-1-build" podStartSLOduration=10.723474433 podStartE2EDuration="10.723474433s" podCreationTimestamp="2026-02-24 00:21:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:21:34.092099572 +0000 UTC m=+958.081724041" watchObservedRunningTime="2026-02-24 00:21:41.723474433 +0000 UTC m=+965.713098902"
Feb 24 00:21:41 crc kubenswrapper[4824]: I0224 00:21:41.728595 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"]
Feb 24 00:21:41 crc kubenswrapper[4824]: I0224 00:21:41.728818 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/smart-gateway-operator-1-build" podUID="7c94136e-3210-48f4-bd2c-cbfb25d117b6" containerName="docker-build" containerID="cri-o://39243a21b42b77e3a813ff45c21060c950e9e46e56e45b01ee3b7c5543bc3f0d" gracePeriod=30
Feb 24 00:21:43 crc kubenswrapper[4824]: I0224 00:21:43.381024 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"]
Feb 24 00:21:43 crc kubenswrapper[4824]: I0224 00:21:43.382965 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build"
Feb 24 00:21:43 crc kubenswrapper[4824]: I0224 00:21:43.385736 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-2-global-ca"
Feb 24 00:21:43 crc kubenswrapper[4824]: I0224 00:21:43.386029 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-2-sys-config"
Feb 24 00:21:43 crc kubenswrapper[4824]: I0224 00:21:43.386204 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-2-ca"
Feb 24 00:21:43 crc kubenswrapper[4824]: I0224 00:21:43.405854 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"]
Feb 24 00:21:43 crc kubenswrapper[4824]: I0224 00:21:43.493990 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/87c4f157-f66c-485a-b29d-db482b59c2a1-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"87c4f157-f66c-485a-b29d-db482b59c2a1\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 24 00:21:43 crc kubenswrapper[4824]: I0224 00:21:43.494431 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/87c4f157-f66c-485a-b29d-db482b59c2a1-builder-dockercfg-gqxxs-push\") pod \"smart-gateway-operator-2-build\" (UID: \"87c4f157-f66c-485a-b29d-db482b59c2a1\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 24 00:21:43 crc kubenswrapper[4824]: I0224 00:21:43.494547 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/87c4f157-f66c-485a-b29d-db482b59c2a1-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"87c4f157-f66c-485a-b29d-db482b59c2a1\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 24 00:21:43 crc kubenswrapper[4824]: I0224 00:21:43.494703 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/87c4f157-f66c-485a-b29d-db482b59c2a1-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"87c4f157-f66c-485a-b29d-db482b59c2a1\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 24 00:21:43 crc kubenswrapper[4824]: I0224 00:21:43.494798 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/87c4f157-f66c-485a-b29d-db482b59c2a1-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"87c4f157-f66c-485a-b29d-db482b59c2a1\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 24 00:21:43 crc kubenswrapper[4824]: I0224 00:21:43.495831 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/87c4f157-f66c-485a-b29d-db482b59c2a1-builder-dockercfg-gqxxs-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"87c4f157-f66c-485a-b29d-db482b59c2a1\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 24 00:21:43 crc kubenswrapper[4824]: I0224 00:21:43.495908 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49jr7\" (UniqueName: \"kubernetes.io/projected/87c4f157-f66c-485a-b29d-db482b59c2a1-kube-api-access-49jr7\") pod \"smart-gateway-operator-2-build\" (UID: \"87c4f157-f66c-485a-b29d-db482b59c2a1\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 24 00:21:43 crc kubenswrapper[4824]: I0224 00:21:43.496068 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/87c4f157-f66c-485a-b29d-db482b59c2a1-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"87c4f157-f66c-485a-b29d-db482b59c2a1\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 24 00:21:43 crc kubenswrapper[4824]: I0224 00:21:43.496159 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/87c4f157-f66c-485a-b29d-db482b59c2a1-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"87c4f157-f66c-485a-b29d-db482b59c2a1\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 24 00:21:43 crc kubenswrapper[4824]: I0224 00:21:43.496213 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/87c4f157-f66c-485a-b29d-db482b59c2a1-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"87c4f157-f66c-485a-b29d-db482b59c2a1\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 24 00:21:43 crc kubenswrapper[4824]: I0224 00:21:43.496270 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/87c4f157-f66c-485a-b29d-db482b59c2a1-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"87c4f157-f66c-485a-b29d-db482b59c2a1\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 24 00:21:43 crc kubenswrapper[4824]: I0224 00:21:43.496290 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/87c4f157-f66c-485a-b29d-db482b59c2a1-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"87c4f157-f66c-485a-b29d-db482b59c2a1\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 24 00:21:43 crc kubenswrapper[4824]: I0224 00:21:43.598473 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/87c4f157-f66c-485a-b29d-db482b59c2a1-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"87c4f157-f66c-485a-b29d-db482b59c2a1\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 24 00:21:43 crc kubenswrapper[4824]: I0224 00:21:43.598568 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/87c4f157-f66c-485a-b29d-db482b59c2a1-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"87c4f157-f66c-485a-b29d-db482b59c2a1\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 24 00:21:43 crc kubenswrapper[4824]: I0224 00:21:43.598616 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/87c4f157-f66c-485a-b29d-db482b59c2a1-builder-dockercfg-gqxxs-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"87c4f157-f66c-485a-b29d-db482b59c2a1\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 24 00:21:43 crc kubenswrapper[4824]: I0224 00:21:43.598641 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49jr7\" (UniqueName: \"kubernetes.io/projected/87c4f157-f66c-485a-b29d-db482b59c2a1-kube-api-access-49jr7\") pod \"smart-gateway-operator-2-build\" (UID: \"87c4f157-f66c-485a-b29d-db482b59c2a1\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 24 00:21:43 crc kubenswrapper[4824]: I0224 00:21:43.598676 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/87c4f157-f66c-485a-b29d-db482b59c2a1-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"87c4f157-f66c-485a-b29d-db482b59c2a1\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 24 00:21:43 crc kubenswrapper[4824]: I0224 00:21:43.598709 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/87c4f157-f66c-485a-b29d-db482b59c2a1-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"87c4f157-f66c-485a-b29d-db482b59c2a1\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 24 00:21:43 crc kubenswrapper[4824]: I0224 00:21:43.598740 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/87c4f157-f66c-485a-b29d-db482b59c2a1-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"87c4f157-f66c-485a-b29d-db482b59c2a1\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 24 00:21:43 crc kubenswrapper[4824]: I0224 00:21:43.598773 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/87c4f157-f66c-485a-b29d-db482b59c2a1-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"87c4f157-f66c-485a-b29d-db482b59c2a1\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 24 00:21:43 crc kubenswrapper[4824]: I0224 00:21:43.598796 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/87c4f157-f66c-485a-b29d-db482b59c2a1-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"87c4f157-f66c-485a-b29d-db482b59c2a1\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 24 00:21:43 crc kubenswrapper[4824]: I0224 00:21:43.598824 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/87c4f157-f66c-485a-b29d-db482b59c2a1-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"87c4f157-f66c-485a-b29d-db482b59c2a1\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 24 00:21:43 crc kubenswrapper[4824]: I0224 00:21:43.598880 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/87c4f157-f66c-485a-b29d-db482b59c2a1-builder-dockercfg-gqxxs-push\") pod \"smart-gateway-operator-2-build\" (UID: \"87c4f157-f66c-485a-b29d-db482b59c2a1\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 24 00:21:43 crc kubenswrapper[4824]: I0224 00:21:43.598913 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/87c4f157-f66c-485a-b29d-db482b59c2a1-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"87c4f157-f66c-485a-b29d-db482b59c2a1\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 24 00:21:43 crc kubenswrapper[4824]: I0224 00:21:43.599369 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/87c4f157-f66c-485a-b29d-db482b59c2a1-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"87c4f157-f66c-485a-b29d-db482b59c2a1\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 24 00:21:43 crc kubenswrapper[4824]: I0224 00:21:43.599400 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/87c4f157-f66c-485a-b29d-db482b59c2a1-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"87c4f157-f66c-485a-b29d-db482b59c2a1\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 24 00:21:43 crc kubenswrapper[4824]: I0224 00:21:43.599601 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/87c4f157-f66c-485a-b29d-db482b59c2a1-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"87c4f157-f66c-485a-b29d-db482b59c2a1\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 24 00:21:43 crc kubenswrapper[4824]: I0224 00:21:43.599633 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/87c4f157-f66c-485a-b29d-db482b59c2a1-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"87c4f157-f66c-485a-b29d-db482b59c2a1\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 24 00:21:43 crc kubenswrapper[4824]: I0224 00:21:43.599754 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/87c4f157-f66c-485a-b29d-db482b59c2a1-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"87c4f157-f66c-485a-b29d-db482b59c2a1\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 24 00:21:43 crc kubenswrapper[4824]: I0224 00:21:43.600269 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/87c4f157-f66c-485a-b29d-db482b59c2a1-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"87c4f157-f66c-485a-b29d-db482b59c2a1\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 24 00:21:43 crc kubenswrapper[4824]: I0224 00:21:43.600457 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/87c4f157-f66c-485a-b29d-db482b59c2a1-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"87c4f157-f66c-485a-b29d-db482b59c2a1\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 24 00:21:43 crc kubenswrapper[4824]: I0224 00:21:43.600660 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/87c4f157-f66c-485a-b29d-db482b59c2a1-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"87c4f157-f66c-485a-b29d-db482b59c2a1\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 24 00:21:43 crc kubenswrapper[4824]: I0224 00:21:43.600906 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/87c4f157-f66c-485a-b29d-db482b59c2a1-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"87c4f157-f66c-485a-b29d-db482b59c2a1\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 24 00:21:43 crc kubenswrapper[4824]: I0224 00:21:43.610731 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/87c4f157-f66c-485a-b29d-db482b59c2a1-builder-dockercfg-gqxxs-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"87c4f157-f66c-485a-b29d-db482b59c2a1\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 24 00:21:43 crc kubenswrapper[4824]: I0224 00:21:43.611009 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/87c4f157-f66c-485a-b29d-db482b59c2a1-builder-dockercfg-gqxxs-push\") pod \"smart-gateway-operator-2-build\" (UID: \"87c4f157-f66c-485a-b29d-db482b59c2a1\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 24 00:21:43 crc kubenswrapper[4824]: I0224 00:21:43.636694 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49jr7\" (UniqueName: \"kubernetes.io/projected/87c4f157-f66c-485a-b29d-db482b59c2a1-kube-api-access-49jr7\") pod \"smart-gateway-operator-2-build\" (UID: \"87c4f157-f66c-485a-b29d-db482b59c2a1\") " pod="service-telemetry/smart-gateway-operator-2-build"
Feb 24 00:21:43 crc kubenswrapper[4824]: I0224 00:21:43.705966 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build"
Feb 24 00:21:43 crc kubenswrapper[4824]: I0224 00:21:43.928127 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"]
Feb 24 00:21:44 crc kubenswrapper[4824]: I0224 00:21:44.158075 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"87c4f157-f66c-485a-b29d-db482b59c2a1","Type":"ContainerStarted","Data":"360ddab163ca47c1a0fb0338c2f478a6b9ea9e3e2cf8ffd6966eab886336bcb9"}
Feb 24 00:21:47 crc kubenswrapper[4824]: I0224 00:21:47.270295 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-1-build_7c94136e-3210-48f4-bd2c-cbfb25d117b6/docker-build/0.log"
Feb 24 00:21:47 crc kubenswrapper[4824]: I0224 00:21:47.271625 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build"
Feb 24 00:21:47 crc kubenswrapper[4824]: I0224 00:21:47.288415 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-1-build_7c94136e-3210-48f4-bd2c-cbfb25d117b6/docker-build/0.log"
Feb 24 00:21:47 crc kubenswrapper[4824]: I0224 00:21:47.289234 4824 generic.go:334] "Generic (PLEG): container finished" podID="7c94136e-3210-48f4-bd2c-cbfb25d117b6" containerID="39243a21b42b77e3a813ff45c21060c950e9e46e56e45b01ee3b7c5543bc3f0d" exitCode=1
Feb 24 00:21:47 crc kubenswrapper[4824]: I0224 00:21:47.289287 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"7c94136e-3210-48f4-bd2c-cbfb25d117b6","Type":"ContainerDied","Data":"39243a21b42b77e3a813ff45c21060c950e9e46e56e45b01ee3b7c5543bc3f0d"}
Feb 24 00:21:47 crc kubenswrapper[4824]: I0224 00:21:47.289327 4824 scope.go:117] "RemoveContainer" containerID="39243a21b42b77e3a813ff45c21060c950e9e46e56e45b01ee3b7c5543bc3f0d"
Feb 24 00:21:47 crc kubenswrapper[4824]: I0224 00:21:47.360322 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/7c94136e-3210-48f4-bd2c-cbfb25d117b6-build-system-configs\") pod \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\" (UID: \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\") "
Feb 24 00:21:47 crc kubenswrapper[4824]: I0224 00:21:47.360392 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7c94136e-3210-48f4-bd2c-cbfb25d117b6-build-proxy-ca-bundles\") pod \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\" (UID: \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\") "
Feb 24 00:21:47 crc kubenswrapper[4824]: I0224 00:21:47.360431 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/7c94136e-3210-48f4-bd2c-cbfb25d117b6-builder-dockercfg-gqxxs-push\") pod \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\" (UID: \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\") "
Feb 24 00:21:47 crc kubenswrapper[4824]: I0224 00:21:47.360494 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7c94136e-3210-48f4-bd2c-cbfb25d117b6-build-blob-cache\") pod \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\" (UID: \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\") "
Feb 24 00:21:47 crc kubenswrapper[4824]: I0224 00:21:47.360573 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7c94136e-3210-48f4-bd2c-cbfb25d117b6-buildcachedir\") pod \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\" (UID: \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\") "
Feb 24 00:21:47 crc kubenswrapper[4824]: I0224 00:21:47.360611 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/7c94136e-3210-48f4-bd2c-cbfb25d117b6-builder-dockercfg-gqxxs-pull\") pod \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\" (UID: \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\") "
Feb 24 00:21:47 crc kubenswrapper[4824]: I0224 00:21:47.360639 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7znl\" (UniqueName: \"kubernetes.io/projected/7c94136e-3210-48f4-bd2c-cbfb25d117b6-kube-api-access-n7znl\") pod \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\" (UID: \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\") "
Feb 24 00:21:47 crc kubenswrapper[4824]: I0224 00:21:47.360720 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7c94136e-3210-48f4-bd2c-cbfb25d117b6-container-storage-run\") pod \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\" (UID: \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\") "
Feb 24 00:21:47 crc kubenswrapper[4824]: I0224 00:21:47.360775 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7c94136e-3210-48f4-bd2c-cbfb25d117b6-buildworkdir\") pod \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\" (UID: \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\") "
Feb 24 00:21:47 crc kubenswrapper[4824]: I0224 00:21:47.360799 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7c94136e-3210-48f4-bd2c-cbfb25d117b6-node-pullsecrets\") pod \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\" (UID: \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\") "
Feb 24 00:21:47 crc kubenswrapper[4824]: I0224 00:21:47.360826 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7c94136e-3210-48f4-bd2c-cbfb25d117b6-container-storage-root\") pod \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\" (UID: \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\") "
Feb 24 00:21:47 crc kubenswrapper[4824]: I0224 00:21:47.360850 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7c94136e-3210-48f4-bd2c-cbfb25d117b6-build-ca-bundles\") pod \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\" (UID: \"7c94136e-3210-48f4-bd2c-cbfb25d117b6\") "
Feb 24 00:21:47 crc kubenswrapper[4824]: I0224 00:21:47.362175 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c94136e-3210-48f4-bd2c-cbfb25d117b6-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "7c94136e-3210-48f4-bd2c-cbfb25d117b6" (UID: "7c94136e-3210-48f4-bd2c-cbfb25d117b6"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 00:21:47 crc kubenswrapper[4824]: I0224 00:21:47.362336 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c94136e-3210-48f4-bd2c-cbfb25d117b6-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "7c94136e-3210-48f4-bd2c-cbfb25d117b6" (UID: "7c94136e-3210-48f4-bd2c-cbfb25d117b6"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 00:21:47 crc kubenswrapper[4824]: I0224 00:21:47.362377 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7c94136e-3210-48f4-bd2c-cbfb25d117b6-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "7c94136e-3210-48f4-bd2c-cbfb25d117b6" (UID: "7c94136e-3210-48f4-bd2c-cbfb25d117b6"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 00:21:47 crc kubenswrapper[4824]: I0224 00:21:47.362524 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c94136e-3210-48f4-bd2c-cbfb25d117b6-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "7c94136e-3210-48f4-bd2c-cbfb25d117b6" (UID: "7c94136e-3210-48f4-bd2c-cbfb25d117b6"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:21:47 crc kubenswrapper[4824]: I0224 00:21:47.362561 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7c94136e-3210-48f4-bd2c-cbfb25d117b6-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "7c94136e-3210-48f4-bd2c-cbfb25d117b6" (UID: "7c94136e-3210-48f4-bd2c-cbfb25d117b6"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 00:21:47 crc kubenswrapper[4824]: I0224 00:21:47.363305 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c94136e-3210-48f4-bd2c-cbfb25d117b6-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "7c94136e-3210-48f4-bd2c-cbfb25d117b6" (UID: "7c94136e-3210-48f4-bd2c-cbfb25d117b6"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:21:47 crc kubenswrapper[4824]: I0224 00:21:47.378151 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c94136e-3210-48f4-bd2c-cbfb25d117b6-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "7c94136e-3210-48f4-bd2c-cbfb25d117b6" (UID: "7c94136e-3210-48f4-bd2c-cbfb25d117b6"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:21:47 crc kubenswrapper[4824]: I0224 00:21:47.387040 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c94136e-3210-48f4-bd2c-cbfb25d117b6-builder-dockercfg-gqxxs-push" (OuterVolumeSpecName: "builder-dockercfg-gqxxs-push") pod "7c94136e-3210-48f4-bd2c-cbfb25d117b6" (UID: "7c94136e-3210-48f4-bd2c-cbfb25d117b6"). InnerVolumeSpecName "builder-dockercfg-gqxxs-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:21:47 crc kubenswrapper[4824]: I0224 00:21:47.387088 4824 scope.go:117] "RemoveContainer" containerID="a826809630bf7ea89dbd405b13377d29a8fc6004a50e897939509359da21749a" Feb 24 00:21:47 crc kubenswrapper[4824]: I0224 00:21:47.387091 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c94136e-3210-48f4-bd2c-cbfb25d117b6-kube-api-access-n7znl" (OuterVolumeSpecName: "kube-api-access-n7znl") pod "7c94136e-3210-48f4-bd2c-cbfb25d117b6" (UID: "7c94136e-3210-48f4-bd2c-cbfb25d117b6"). InnerVolumeSpecName "kube-api-access-n7znl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:21:47 crc kubenswrapper[4824]: I0224 00:21:47.392883 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c94136e-3210-48f4-bd2c-cbfb25d117b6-builder-dockercfg-gqxxs-pull" (OuterVolumeSpecName: "builder-dockercfg-gqxxs-pull") pod "7c94136e-3210-48f4-bd2c-cbfb25d117b6" (UID: "7c94136e-3210-48f4-bd2c-cbfb25d117b6"). InnerVolumeSpecName "builder-dockercfg-gqxxs-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:21:47 crc kubenswrapper[4824]: I0224 00:21:47.462632 4824 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7c94136e-3210-48f4-bd2c-cbfb25d117b6-buildworkdir\") on node \"crc\" DevicePath \"\"" Feb 24 00:21:47 crc kubenswrapper[4824]: I0224 00:21:47.462693 4824 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7c94136e-3210-48f4-bd2c-cbfb25d117b6-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 24 00:21:47 crc kubenswrapper[4824]: I0224 00:21:47.462706 4824 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7c94136e-3210-48f4-bd2c-cbfb25d117b6-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Feb 24 00:21:47 crc kubenswrapper[4824]: I0224 00:21:47.462716 4824 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/7c94136e-3210-48f4-bd2c-cbfb25d117b6-build-system-configs\") on node \"crc\" DevicePath \"\"" Feb 24 00:21:47 crc kubenswrapper[4824]: I0224 00:21:47.462728 4824 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7c94136e-3210-48f4-bd2c-cbfb25d117b6-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 24 00:21:47 crc kubenswrapper[4824]: I0224 00:21:47.462738 4824 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/7c94136e-3210-48f4-bd2c-cbfb25d117b6-builder-dockercfg-gqxxs-push\") on node \"crc\" DevicePath \"\"" Feb 24 00:21:47 crc kubenswrapper[4824]: I0224 00:21:47.462750 4824 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7c94136e-3210-48f4-bd2c-cbfb25d117b6-buildcachedir\") on node \"crc\" DevicePath \"\"" Feb 24 00:21:47 
crc kubenswrapper[4824]: I0224 00:21:47.462762 4824 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/7c94136e-3210-48f4-bd2c-cbfb25d117b6-builder-dockercfg-gqxxs-pull\") on node \"crc\" DevicePath \"\"" Feb 24 00:21:47 crc kubenswrapper[4824]: I0224 00:21:47.462772 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7znl\" (UniqueName: \"kubernetes.io/projected/7c94136e-3210-48f4-bd2c-cbfb25d117b6-kube-api-access-n7znl\") on node \"crc\" DevicePath \"\"" Feb 24 00:21:47 crc kubenswrapper[4824]: I0224 00:21:47.462781 4824 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7c94136e-3210-48f4-bd2c-cbfb25d117b6-container-storage-run\") on node \"crc\" DevicePath \"\"" Feb 24 00:21:47 crc kubenswrapper[4824]: I0224 00:21:47.763689 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c94136e-3210-48f4-bd2c-cbfb25d117b6-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "7c94136e-3210-48f4-bd2c-cbfb25d117b6" (UID: "7c94136e-3210-48f4-bd2c-cbfb25d117b6"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:21:47 crc kubenswrapper[4824]: I0224 00:21:47.768033 4824 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7c94136e-3210-48f4-bd2c-cbfb25d117b6-build-blob-cache\") on node \"crc\" DevicePath \"\"" Feb 24 00:21:47 crc kubenswrapper[4824]: I0224 00:21:47.770723 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c94136e-3210-48f4-bd2c-cbfb25d117b6-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "7c94136e-3210-48f4-bd2c-cbfb25d117b6" (UID: "7c94136e-3210-48f4-bd2c-cbfb25d117b6"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:21:47 crc kubenswrapper[4824]: I0224 00:21:47.869511 4824 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7c94136e-3210-48f4-bd2c-cbfb25d117b6-container-storage-root\") on node \"crc\" DevicePath \"\"" Feb 24 00:21:48 crc kubenswrapper[4824]: I0224 00:21:48.297902 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Feb 24 00:21:48 crc kubenswrapper[4824]: I0224 00:21:48.297894 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"7c94136e-3210-48f4-bd2c-cbfb25d117b6","Type":"ContainerDied","Data":"38ae562907291985af4b9d5f5cf7656a88ee4ee3a1caf482cc7dab2a3e8b6f2c"} Feb 24 00:21:48 crc kubenswrapper[4824]: I0224 00:21:48.302342 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"87c4f157-f66c-485a-b29d-db482b59c2a1","Type":"ContainerStarted","Data":"45ae95c2644ed487ab9447779612d358e7e091ad8916d58155d5c783400a2e9d"} Feb 24 00:21:48 crc kubenswrapper[4824]: I0224 00:21:48.378036 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Feb 24 00:21:48 crc kubenswrapper[4824]: I0224 00:21:48.388247 4824 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Feb 24 00:21:48 crc kubenswrapper[4824]: E0224 00:21:48.388690 4824 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c94136e_3210_48f4_bd2c_cbfb25d117b6.slice/crio-38ae562907291985af4b9d5f5cf7656a88ee4ee3a1caf482cc7dab2a3e8b6f2c\": RecentStats: unable to find data in memory cache]" Feb 24 00:21:48 crc kubenswrapper[4824]: I0224 
00:21:48.702378 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c94136e-3210-48f4-bd2c-cbfb25d117b6" path="/var/lib/kubelet/pods/7c94136e-3210-48f4-bd2c-cbfb25d117b6/volumes" Feb 24 00:21:49 crc kubenswrapper[4824]: I0224 00:21:49.311755 4824 generic.go:334] "Generic (PLEG): container finished" podID="87c4f157-f66c-485a-b29d-db482b59c2a1" containerID="45ae95c2644ed487ab9447779612d358e7e091ad8916d58155d5c783400a2e9d" exitCode=0 Feb 24 00:21:49 crc kubenswrapper[4824]: I0224 00:21:49.311833 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"87c4f157-f66c-485a-b29d-db482b59c2a1","Type":"ContainerDied","Data":"45ae95c2644ed487ab9447779612d358e7e091ad8916d58155d5c783400a2e9d"} Feb 24 00:21:50 crc kubenswrapper[4824]: I0224 00:21:50.321879 4824 generic.go:334] "Generic (PLEG): container finished" podID="87c4f157-f66c-485a-b29d-db482b59c2a1" containerID="cd0d73920f56d7da26bfcf6fd489e39d2a6046d8d8b6ab81f5c0c605a8da38fd" exitCode=0 Feb 24 00:21:50 crc kubenswrapper[4824]: I0224 00:21:50.322026 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"87c4f157-f66c-485a-b29d-db482b59c2a1","Type":"ContainerDied","Data":"cd0d73920f56d7da26bfcf6fd489e39d2a6046d8d8b6ab81f5c0c605a8da38fd"} Feb 24 00:21:50 crc kubenswrapper[4824]: I0224 00:21:50.367343 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-2-build_87c4f157-f66c-485a-b29d-db482b59c2a1/manage-dockerfile/0.log" Feb 24 00:21:51 crc kubenswrapper[4824]: I0224 00:21:51.331625 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"87c4f157-f66c-485a-b29d-db482b59c2a1","Type":"ContainerStarted","Data":"9aa60fad0c5e5b125c85a695ee9699aa7edfa41691acaf3323a949446eac0c81"} Feb 24 00:21:51 crc kubenswrapper[4824]: I0224 00:21:51.367262 4824 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-2-build" podStartSLOduration=8.367238147 podStartE2EDuration="8.367238147s" podCreationTimestamp="2026-02-24 00:21:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:21:51.360443943 +0000 UTC m=+975.350068412" watchObservedRunningTime="2026-02-24 00:21:51.367238147 +0000 UTC m=+975.356862626" Feb 24 00:21:53 crc kubenswrapper[4824]: I0224 00:21:53.276358 4824 patch_prober.go:28] interesting pod/machine-config-daemon-vcbgn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 00:21:53 crc kubenswrapper[4824]: I0224 00:21:53.276693 4824 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 00:21:56 crc kubenswrapper[4824]: I0224 00:21:56.612023 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-q2g4m"] Feb 24 00:21:56 crc kubenswrapper[4824]: E0224 00:21:56.612434 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c94136e-3210-48f4-bd2c-cbfb25d117b6" containerName="manage-dockerfile" Feb 24 00:21:56 crc kubenswrapper[4824]: I0224 00:21:56.612457 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c94136e-3210-48f4-bd2c-cbfb25d117b6" containerName="manage-dockerfile" Feb 24 00:21:56 crc kubenswrapper[4824]: E0224 00:21:56.612482 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c94136e-3210-48f4-bd2c-cbfb25d117b6" 
containerName="docker-build" Feb 24 00:21:56 crc kubenswrapper[4824]: I0224 00:21:56.612496 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c94136e-3210-48f4-bd2c-cbfb25d117b6" containerName="docker-build" Feb 24 00:21:56 crc kubenswrapper[4824]: I0224 00:21:56.612705 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c94136e-3210-48f4-bd2c-cbfb25d117b6" containerName="docker-build" Feb 24 00:21:56 crc kubenswrapper[4824]: I0224 00:21:56.614046 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q2g4m" Feb 24 00:21:56 crc kubenswrapper[4824]: I0224 00:21:56.630204 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q2g4m"] Feb 24 00:21:56 crc kubenswrapper[4824]: I0224 00:21:56.713073 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e34f5eb-6bdc-406e-aaed-ff979c64f6db-utilities\") pod \"redhat-operators-q2g4m\" (UID: \"5e34f5eb-6bdc-406e-aaed-ff979c64f6db\") " pod="openshift-marketplace/redhat-operators-q2g4m" Feb 24 00:21:56 crc kubenswrapper[4824]: I0224 00:21:56.713703 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e34f5eb-6bdc-406e-aaed-ff979c64f6db-catalog-content\") pod \"redhat-operators-q2g4m\" (UID: \"5e34f5eb-6bdc-406e-aaed-ff979c64f6db\") " pod="openshift-marketplace/redhat-operators-q2g4m" Feb 24 00:21:56 crc kubenswrapper[4824]: I0224 00:21:56.713909 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq7jv\" (UniqueName: \"kubernetes.io/projected/5e34f5eb-6bdc-406e-aaed-ff979c64f6db-kube-api-access-lq7jv\") pod \"redhat-operators-q2g4m\" (UID: \"5e34f5eb-6bdc-406e-aaed-ff979c64f6db\") " 
pod="openshift-marketplace/redhat-operators-q2g4m" Feb 24 00:21:56 crc kubenswrapper[4824]: I0224 00:21:56.815854 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e34f5eb-6bdc-406e-aaed-ff979c64f6db-utilities\") pod \"redhat-operators-q2g4m\" (UID: \"5e34f5eb-6bdc-406e-aaed-ff979c64f6db\") " pod="openshift-marketplace/redhat-operators-q2g4m" Feb 24 00:21:56 crc kubenswrapper[4824]: I0224 00:21:56.815922 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e34f5eb-6bdc-406e-aaed-ff979c64f6db-catalog-content\") pod \"redhat-operators-q2g4m\" (UID: \"5e34f5eb-6bdc-406e-aaed-ff979c64f6db\") " pod="openshift-marketplace/redhat-operators-q2g4m" Feb 24 00:21:56 crc kubenswrapper[4824]: I0224 00:21:56.815969 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq7jv\" (UniqueName: \"kubernetes.io/projected/5e34f5eb-6bdc-406e-aaed-ff979c64f6db-kube-api-access-lq7jv\") pod \"redhat-operators-q2g4m\" (UID: \"5e34f5eb-6bdc-406e-aaed-ff979c64f6db\") " pod="openshift-marketplace/redhat-operators-q2g4m" Feb 24 00:21:56 crc kubenswrapper[4824]: I0224 00:21:56.817231 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e34f5eb-6bdc-406e-aaed-ff979c64f6db-utilities\") pod \"redhat-operators-q2g4m\" (UID: \"5e34f5eb-6bdc-406e-aaed-ff979c64f6db\") " pod="openshift-marketplace/redhat-operators-q2g4m" Feb 24 00:21:56 crc kubenswrapper[4824]: I0224 00:21:56.817752 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e34f5eb-6bdc-406e-aaed-ff979c64f6db-catalog-content\") pod \"redhat-operators-q2g4m\" (UID: \"5e34f5eb-6bdc-406e-aaed-ff979c64f6db\") " pod="openshift-marketplace/redhat-operators-q2g4m" Feb 24 00:21:56 crc 
kubenswrapper[4824]: I0224 00:21:56.843627 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq7jv\" (UniqueName: \"kubernetes.io/projected/5e34f5eb-6bdc-406e-aaed-ff979c64f6db-kube-api-access-lq7jv\") pod \"redhat-operators-q2g4m\" (UID: \"5e34f5eb-6bdc-406e-aaed-ff979c64f6db\") " pod="openshift-marketplace/redhat-operators-q2g4m" Feb 24 00:21:56 crc kubenswrapper[4824]: I0224 00:21:56.940223 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q2g4m" Feb 24 00:21:57 crc kubenswrapper[4824]: I0224 00:21:57.385456 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q2g4m"] Feb 24 00:21:57 crc kubenswrapper[4824]: W0224 00:21:57.391725 4824 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e34f5eb_6bdc_406e_aaed_ff979c64f6db.slice/crio-c154e571981648f7115b3e1d513a6874710dbdb0e4c9bd37e0e746f9c26709ae WatchSource:0}: Error finding container c154e571981648f7115b3e1d513a6874710dbdb0e4c9bd37e0e746f9c26709ae: Status 404 returned error can't find the container with id c154e571981648f7115b3e1d513a6874710dbdb0e4c9bd37e0e746f9c26709ae Feb 24 00:21:58 crc kubenswrapper[4824]: I0224 00:21:58.381495 4824 generic.go:334] "Generic (PLEG): container finished" podID="5e34f5eb-6bdc-406e-aaed-ff979c64f6db" containerID="465fc0bdabcaaeddc87638959d003ee0489fecdf8b29583f92b701894d90d1f2" exitCode=0 Feb 24 00:21:58 crc kubenswrapper[4824]: I0224 00:21:58.381574 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q2g4m" event={"ID":"5e34f5eb-6bdc-406e-aaed-ff979c64f6db","Type":"ContainerDied","Data":"465fc0bdabcaaeddc87638959d003ee0489fecdf8b29583f92b701894d90d1f2"} Feb 24 00:21:58 crc kubenswrapper[4824]: I0224 00:21:58.381643 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-q2g4m" event={"ID":"5e34f5eb-6bdc-406e-aaed-ff979c64f6db","Type":"ContainerStarted","Data":"c154e571981648f7115b3e1d513a6874710dbdb0e4c9bd37e0e746f9c26709ae"} Feb 24 00:21:59 crc kubenswrapper[4824]: I0224 00:21:59.391502 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q2g4m" event={"ID":"5e34f5eb-6bdc-406e-aaed-ff979c64f6db","Type":"ContainerStarted","Data":"5bd458f8ab6e18655e507e975fc2e2718159f8ffd4e3a3ea392e57cab31af1b7"} Feb 24 00:22:00 crc kubenswrapper[4824]: I0224 00:22:00.405931 4824 generic.go:334] "Generic (PLEG): container finished" podID="5e34f5eb-6bdc-406e-aaed-ff979c64f6db" containerID="5bd458f8ab6e18655e507e975fc2e2718159f8ffd4e3a3ea392e57cab31af1b7" exitCode=0 Feb 24 00:22:00 crc kubenswrapper[4824]: I0224 00:22:00.405992 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q2g4m" event={"ID":"5e34f5eb-6bdc-406e-aaed-ff979c64f6db","Type":"ContainerDied","Data":"5bd458f8ab6e18655e507e975fc2e2718159f8ffd4e3a3ea392e57cab31af1b7"} Feb 24 00:22:09 crc kubenswrapper[4824]: I0224 00:22:09.392675 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lffjr"] Feb 24 00:22:09 crc kubenswrapper[4824]: I0224 00:22:09.395363 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lffjr" Feb 24 00:22:09 crc kubenswrapper[4824]: I0224 00:22:09.408928 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lffjr"] Feb 24 00:22:09 crc kubenswrapper[4824]: I0224 00:22:09.420375 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5daf2179-5386-4221-a15d-0e9787959357-catalog-content\") pod \"community-operators-lffjr\" (UID: \"5daf2179-5386-4221-a15d-0e9787959357\") " pod="openshift-marketplace/community-operators-lffjr" Feb 24 00:22:09 crc kubenswrapper[4824]: I0224 00:22:09.420441 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5daf2179-5386-4221-a15d-0e9787959357-utilities\") pod \"community-operators-lffjr\" (UID: \"5daf2179-5386-4221-a15d-0e9787959357\") " pod="openshift-marketplace/community-operators-lffjr" Feb 24 00:22:09 crc kubenswrapper[4824]: I0224 00:22:09.420474 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4vxp\" (UniqueName: \"kubernetes.io/projected/5daf2179-5386-4221-a15d-0e9787959357-kube-api-access-z4vxp\") pod \"community-operators-lffjr\" (UID: \"5daf2179-5386-4221-a15d-0e9787959357\") " pod="openshift-marketplace/community-operators-lffjr" Feb 24 00:22:09 crc kubenswrapper[4824]: I0224 00:22:09.521697 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4vxp\" (UniqueName: \"kubernetes.io/projected/5daf2179-5386-4221-a15d-0e9787959357-kube-api-access-z4vxp\") pod \"community-operators-lffjr\" (UID: \"5daf2179-5386-4221-a15d-0e9787959357\") " pod="openshift-marketplace/community-operators-lffjr" Feb 24 00:22:09 crc kubenswrapper[4824]: I0224 00:22:09.521877 4824 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5daf2179-5386-4221-a15d-0e9787959357-catalog-content\") pod \"community-operators-lffjr\" (UID: \"5daf2179-5386-4221-a15d-0e9787959357\") " pod="openshift-marketplace/community-operators-lffjr" Feb 24 00:22:09 crc kubenswrapper[4824]: I0224 00:22:09.521915 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5daf2179-5386-4221-a15d-0e9787959357-utilities\") pod \"community-operators-lffjr\" (UID: \"5daf2179-5386-4221-a15d-0e9787959357\") " pod="openshift-marketplace/community-operators-lffjr" Feb 24 00:22:09 crc kubenswrapper[4824]: I0224 00:22:09.522801 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5daf2179-5386-4221-a15d-0e9787959357-catalog-content\") pod \"community-operators-lffjr\" (UID: \"5daf2179-5386-4221-a15d-0e9787959357\") " pod="openshift-marketplace/community-operators-lffjr" Feb 24 00:22:09 crc kubenswrapper[4824]: I0224 00:22:09.522807 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5daf2179-5386-4221-a15d-0e9787959357-utilities\") pod \"community-operators-lffjr\" (UID: \"5daf2179-5386-4221-a15d-0e9787959357\") " pod="openshift-marketplace/community-operators-lffjr" Feb 24 00:22:09 crc kubenswrapper[4824]: I0224 00:22:09.550828 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4vxp\" (UniqueName: \"kubernetes.io/projected/5daf2179-5386-4221-a15d-0e9787959357-kube-api-access-z4vxp\") pod \"community-operators-lffjr\" (UID: \"5daf2179-5386-4221-a15d-0e9787959357\") " pod="openshift-marketplace/community-operators-lffjr" Feb 24 00:22:09 crc kubenswrapper[4824]: I0224 00:22:09.714836 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lffjr"
Feb 24 00:22:10 crc kubenswrapper[4824]: I0224 00:22:10.465945 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lffjr"]
Feb 24 00:22:10 crc kubenswrapper[4824]: I0224 00:22:10.480844 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q2g4m" event={"ID":"5e34f5eb-6bdc-406e-aaed-ff979c64f6db","Type":"ContainerStarted","Data":"831c0b295bbcc0eee583b3247a887cdd5a9f7abc2fafc098ba177ebb72c41c62"}
Feb 24 00:22:11 crc kubenswrapper[4824]: I0224 00:22:11.488602 4824 generic.go:334] "Generic (PLEG): container finished" podID="5daf2179-5386-4221-a15d-0e9787959357" containerID="4323a297be2acca3f04764d788a2723c01de5977be24510091e6dae18368cf46" exitCode=0
Feb 24 00:22:11 crc kubenswrapper[4824]: I0224 00:22:11.488675 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lffjr" event={"ID":"5daf2179-5386-4221-a15d-0e9787959357","Type":"ContainerDied","Data":"4323a297be2acca3f04764d788a2723c01de5977be24510091e6dae18368cf46"}
Feb 24 00:22:11 crc kubenswrapper[4824]: I0224 00:22:11.489256 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lffjr" event={"ID":"5daf2179-5386-4221-a15d-0e9787959357","Type":"ContainerStarted","Data":"fd0d534e6cc3fbc4eb96094ad2d6f52ebe74dd8f3e98453192847a75f0ffa856"}
Feb 24 00:22:11 crc kubenswrapper[4824]: I0224 00:22:11.537879 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-q2g4m" podStartSLOduration=3.719070388 podStartE2EDuration="15.537849884s" podCreationTimestamp="2026-02-24 00:21:56 +0000 UTC" firstStartedPulling="2026-02-24 00:21:58.383782432 +0000 UTC m=+982.373406901" lastFinishedPulling="2026-02-24 00:22:10.202561928 +0000 UTC m=+994.192186397" observedRunningTime="2026-02-24 00:22:11.536296975 +0000 UTC m=+995.525921474" watchObservedRunningTime="2026-02-24 00:22:11.537849884 +0000 UTC m=+995.527474353"
Feb 24 00:22:14 crc kubenswrapper[4824]: I0224 00:22:14.513014 4824 generic.go:334] "Generic (PLEG): container finished" podID="5daf2179-5386-4221-a15d-0e9787959357" containerID="e9f8f2ac56a9a9220e0f2fcf3de55cdda8261bdd3eb977d2fca357bf9c03b9e0" exitCode=0
Feb 24 00:22:14 crc kubenswrapper[4824]: I0224 00:22:14.513227 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lffjr" event={"ID":"5daf2179-5386-4221-a15d-0e9787959357","Type":"ContainerDied","Data":"e9f8f2ac56a9a9220e0f2fcf3de55cdda8261bdd3eb977d2fca357bf9c03b9e0"}
Feb 24 00:22:15 crc kubenswrapper[4824]: I0224 00:22:15.523860 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lffjr" event={"ID":"5daf2179-5386-4221-a15d-0e9787959357","Type":"ContainerStarted","Data":"41227a05a63831cc63a8bba2e9c77df4d85286d5e66dfda245d5fa542a2ef56c"}
Feb 24 00:22:15 crc kubenswrapper[4824]: I0224 00:22:15.543306 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lffjr" podStartSLOduration=3.038876405 podStartE2EDuration="6.543281105s" podCreationTimestamp="2026-02-24 00:22:09 +0000 UTC" firstStartedPulling="2026-02-24 00:22:11.490779743 +0000 UTC m=+995.480404212" lastFinishedPulling="2026-02-24 00:22:14.995184443 +0000 UTC m=+998.984808912" observedRunningTime="2026-02-24 00:22:15.542823973 +0000 UTC m=+999.532448462" watchObservedRunningTime="2026-02-24 00:22:15.543281105 +0000 UTC m=+999.532905564"
Feb 24 00:22:16 crc kubenswrapper[4824]: I0224 00:22:16.941388 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-q2g4m"
Feb 24 00:22:16 crc kubenswrapper[4824]: I0224 00:22:16.941950 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-q2g4m"
Feb 24 00:22:17 crc kubenswrapper[4824]: I0224 00:22:17.995626 4824 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-q2g4m" podUID="5e34f5eb-6bdc-406e-aaed-ff979c64f6db" containerName="registry-server" probeResult="failure" output=<
Feb 24 00:22:17 crc kubenswrapper[4824]: timeout: failed to connect service ":50051" within 1s
Feb 24 00:22:17 crc kubenswrapper[4824]: >
Feb 24 00:22:19 crc kubenswrapper[4824]: I0224 00:22:19.716189 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lffjr"
Feb 24 00:22:19 crc kubenswrapper[4824]: I0224 00:22:19.717027 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lffjr"
Feb 24 00:22:19 crc kubenswrapper[4824]: I0224 00:22:19.756545 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lffjr"
Feb 24 00:22:20 crc kubenswrapper[4824]: I0224 00:22:20.595006 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lffjr"
Feb 24 00:22:20 crc kubenswrapper[4824]: I0224 00:22:20.654125 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lffjr"]
Feb 24 00:22:22 crc kubenswrapper[4824]: I0224 00:22:22.576252 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lffjr" podUID="5daf2179-5386-4221-a15d-0e9787959357" containerName="registry-server" containerID="cri-o://41227a05a63831cc63a8bba2e9c77df4d85286d5e66dfda245d5fa542a2ef56c" gracePeriod=2
Feb 24 00:22:22 crc kubenswrapper[4824]: I0224 00:22:22.727123 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lzf7f"]
Feb 24 00:22:22 crc kubenswrapper[4824]: I0224 00:22:22.735042 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lzf7f"
Feb 24 00:22:22 crc kubenswrapper[4824]: I0224 00:22:22.746405 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lzf7f"]
Feb 24 00:22:22 crc kubenswrapper[4824]: I0224 00:22:22.755565 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ad9454d-0615-4303-a59f-fbcc1d18c56a-catalog-content\") pod \"certified-operators-lzf7f\" (UID: \"8ad9454d-0615-4303-a59f-fbcc1d18c56a\") " pod="openshift-marketplace/certified-operators-lzf7f"
Feb 24 00:22:22 crc kubenswrapper[4824]: I0224 00:22:22.755673 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ad9454d-0615-4303-a59f-fbcc1d18c56a-utilities\") pod \"certified-operators-lzf7f\" (UID: \"8ad9454d-0615-4303-a59f-fbcc1d18c56a\") " pod="openshift-marketplace/certified-operators-lzf7f"
Feb 24 00:22:22 crc kubenswrapper[4824]: I0224 00:22:22.755788 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drcs5\" (UniqueName: \"kubernetes.io/projected/8ad9454d-0615-4303-a59f-fbcc1d18c56a-kube-api-access-drcs5\") pod \"certified-operators-lzf7f\" (UID: \"8ad9454d-0615-4303-a59f-fbcc1d18c56a\") " pod="openshift-marketplace/certified-operators-lzf7f"
Feb 24 00:22:22 crc kubenswrapper[4824]: I0224 00:22:22.857637 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ad9454d-0615-4303-a59f-fbcc1d18c56a-utilities\") pod \"certified-operators-lzf7f\" (UID: \"8ad9454d-0615-4303-a59f-fbcc1d18c56a\") " pod="openshift-marketplace/certified-operators-lzf7f"
Feb 24 00:22:22 crc kubenswrapper[4824]: I0224 00:22:22.857745 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drcs5\" (UniqueName: \"kubernetes.io/projected/8ad9454d-0615-4303-a59f-fbcc1d18c56a-kube-api-access-drcs5\") pod \"certified-operators-lzf7f\" (UID: \"8ad9454d-0615-4303-a59f-fbcc1d18c56a\") " pod="openshift-marketplace/certified-operators-lzf7f"
Feb 24 00:22:22 crc kubenswrapper[4824]: I0224 00:22:22.857781 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ad9454d-0615-4303-a59f-fbcc1d18c56a-catalog-content\") pod \"certified-operators-lzf7f\" (UID: \"8ad9454d-0615-4303-a59f-fbcc1d18c56a\") " pod="openshift-marketplace/certified-operators-lzf7f"
Feb 24 00:22:22 crc kubenswrapper[4824]: I0224 00:22:22.858387 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ad9454d-0615-4303-a59f-fbcc1d18c56a-utilities\") pod \"certified-operators-lzf7f\" (UID: \"8ad9454d-0615-4303-a59f-fbcc1d18c56a\") " pod="openshift-marketplace/certified-operators-lzf7f"
Feb 24 00:22:22 crc kubenswrapper[4824]: I0224 00:22:22.858434 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ad9454d-0615-4303-a59f-fbcc1d18c56a-catalog-content\") pod \"certified-operators-lzf7f\" (UID: \"8ad9454d-0615-4303-a59f-fbcc1d18c56a\") " pod="openshift-marketplace/certified-operators-lzf7f"
Feb 24 00:22:22 crc kubenswrapper[4824]: I0224 00:22:22.884375 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drcs5\" (UniqueName: \"kubernetes.io/projected/8ad9454d-0615-4303-a59f-fbcc1d18c56a-kube-api-access-drcs5\") pod \"certified-operators-lzf7f\" (UID: \"8ad9454d-0615-4303-a59f-fbcc1d18c56a\") " pod="openshift-marketplace/certified-operators-lzf7f"
Feb 24 00:22:23 crc kubenswrapper[4824]: I0224 00:22:23.062890 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lzf7f"
Feb 24 00:22:23 crc kubenswrapper[4824]: I0224 00:22:23.275566 4824 patch_prober.go:28] interesting pod/machine-config-daemon-vcbgn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 24 00:22:23 crc kubenswrapper[4824]: I0224 00:22:23.276138 4824 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 24 00:22:23 crc kubenswrapper[4824]: I0224 00:22:23.338270 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lffjr"
Feb 24 00:22:23 crc kubenswrapper[4824]: I0224 00:22:23.377610 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5daf2179-5386-4221-a15d-0e9787959357-utilities\") pod \"5daf2179-5386-4221-a15d-0e9787959357\" (UID: \"5daf2179-5386-4221-a15d-0e9787959357\") "
Feb 24 00:22:23 crc kubenswrapper[4824]: I0224 00:22:23.379741 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5daf2179-5386-4221-a15d-0e9787959357-utilities" (OuterVolumeSpecName: "utilities") pod "5daf2179-5386-4221-a15d-0e9787959357" (UID: "5daf2179-5386-4221-a15d-0e9787959357"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 00:22:23 crc kubenswrapper[4824]: I0224 00:22:23.380188 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5daf2179-5386-4221-a15d-0e9787959357-catalog-content\") pod \"5daf2179-5386-4221-a15d-0e9787959357\" (UID: \"5daf2179-5386-4221-a15d-0e9787959357\") "
Feb 24 00:22:23 crc kubenswrapper[4824]: I0224 00:22:23.380283 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4vxp\" (UniqueName: \"kubernetes.io/projected/5daf2179-5386-4221-a15d-0e9787959357-kube-api-access-z4vxp\") pod \"5daf2179-5386-4221-a15d-0e9787959357\" (UID: \"5daf2179-5386-4221-a15d-0e9787959357\") "
Feb 24 00:22:23 crc kubenswrapper[4824]: I0224 00:22:23.380724 4824 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5daf2179-5386-4221-a15d-0e9787959357-utilities\") on node \"crc\" DevicePath \"\""
Feb 24 00:22:23 crc kubenswrapper[4824]: I0224 00:22:23.391410 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5daf2179-5386-4221-a15d-0e9787959357-kube-api-access-z4vxp" (OuterVolumeSpecName: "kube-api-access-z4vxp") pod "5daf2179-5386-4221-a15d-0e9787959357" (UID: "5daf2179-5386-4221-a15d-0e9787959357"). InnerVolumeSpecName "kube-api-access-z4vxp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 00:22:23 crc kubenswrapper[4824]: I0224 00:22:23.460098 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5daf2179-5386-4221-a15d-0e9787959357-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5daf2179-5386-4221-a15d-0e9787959357" (UID: "5daf2179-5386-4221-a15d-0e9787959357"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 00:22:23 crc kubenswrapper[4824]: I0224 00:22:23.482499 4824 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5daf2179-5386-4221-a15d-0e9787959357-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 24 00:22:23 crc kubenswrapper[4824]: I0224 00:22:23.482585 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4vxp\" (UniqueName: \"kubernetes.io/projected/5daf2179-5386-4221-a15d-0e9787959357-kube-api-access-z4vxp\") on node \"crc\" DevicePath \"\""
Feb 24 00:22:23 crc kubenswrapper[4824]: I0224 00:22:23.586647 4824 generic.go:334] "Generic (PLEG): container finished" podID="5daf2179-5386-4221-a15d-0e9787959357" containerID="41227a05a63831cc63a8bba2e9c77df4d85286d5e66dfda245d5fa542a2ef56c" exitCode=0
Feb 24 00:22:23 crc kubenswrapper[4824]: I0224 00:22:23.586714 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lffjr" event={"ID":"5daf2179-5386-4221-a15d-0e9787959357","Type":"ContainerDied","Data":"41227a05a63831cc63a8bba2e9c77df4d85286d5e66dfda245d5fa542a2ef56c"}
Feb 24 00:22:23 crc kubenswrapper[4824]: I0224 00:22:23.586750 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lffjr"
Feb 24 00:22:23 crc kubenswrapper[4824]: I0224 00:22:23.586783 4824 scope.go:117] "RemoveContainer" containerID="41227a05a63831cc63a8bba2e9c77df4d85286d5e66dfda245d5fa542a2ef56c"
Feb 24 00:22:23 crc kubenswrapper[4824]: I0224 00:22:23.586766 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lffjr" event={"ID":"5daf2179-5386-4221-a15d-0e9787959357","Type":"ContainerDied","Data":"fd0d534e6cc3fbc4eb96094ad2d6f52ebe74dd8f3e98453192847a75f0ffa856"}
Feb 24 00:22:23 crc kubenswrapper[4824]: I0224 00:22:23.607508 4824 scope.go:117] "RemoveContainer" containerID="e9f8f2ac56a9a9220e0f2fcf3de55cdda8261bdd3eb977d2fca357bf9c03b9e0"
Feb 24 00:22:23 crc kubenswrapper[4824]: I0224 00:22:23.623013 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lffjr"]
Feb 24 00:22:23 crc kubenswrapper[4824]: I0224 00:22:23.632314 4824 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lffjr"]
Feb 24 00:22:23 crc kubenswrapper[4824]: I0224 00:22:23.640906 4824 scope.go:117] "RemoveContainer" containerID="4323a297be2acca3f04764d788a2723c01de5977be24510091e6dae18368cf46"
Feb 24 00:22:23 crc kubenswrapper[4824]: I0224 00:22:23.642564 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lzf7f"]
Feb 24 00:22:23 crc kubenswrapper[4824]: I0224 00:22:23.661352 4824 scope.go:117] "RemoveContainer" containerID="41227a05a63831cc63a8bba2e9c77df4d85286d5e66dfda245d5fa542a2ef56c"
Feb 24 00:22:23 crc kubenswrapper[4824]: E0224 00:22:23.661817 4824 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41227a05a63831cc63a8bba2e9c77df4d85286d5e66dfda245d5fa542a2ef56c\": container with ID starting with 41227a05a63831cc63a8bba2e9c77df4d85286d5e66dfda245d5fa542a2ef56c not found: ID does not exist" containerID="41227a05a63831cc63a8bba2e9c77df4d85286d5e66dfda245d5fa542a2ef56c"
Feb 24 00:22:23 crc kubenswrapper[4824]: I0224 00:22:23.661887 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41227a05a63831cc63a8bba2e9c77df4d85286d5e66dfda245d5fa542a2ef56c"} err="failed to get container status \"41227a05a63831cc63a8bba2e9c77df4d85286d5e66dfda245d5fa542a2ef56c\": rpc error: code = NotFound desc = could not find container \"41227a05a63831cc63a8bba2e9c77df4d85286d5e66dfda245d5fa542a2ef56c\": container with ID starting with 41227a05a63831cc63a8bba2e9c77df4d85286d5e66dfda245d5fa542a2ef56c not found: ID does not exist"
Feb 24 00:22:23 crc kubenswrapper[4824]: I0224 00:22:23.661940 4824 scope.go:117] "RemoveContainer" containerID="e9f8f2ac56a9a9220e0f2fcf3de55cdda8261bdd3eb977d2fca357bf9c03b9e0"
Feb 24 00:22:23 crc kubenswrapper[4824]: E0224 00:22:23.665802 4824 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9f8f2ac56a9a9220e0f2fcf3de55cdda8261bdd3eb977d2fca357bf9c03b9e0\": container with ID starting with e9f8f2ac56a9a9220e0f2fcf3de55cdda8261bdd3eb977d2fca357bf9c03b9e0 not found: ID does not exist" containerID="e9f8f2ac56a9a9220e0f2fcf3de55cdda8261bdd3eb977d2fca357bf9c03b9e0"
Feb 24 00:22:23 crc kubenswrapper[4824]: I0224 00:22:23.665846 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9f8f2ac56a9a9220e0f2fcf3de55cdda8261bdd3eb977d2fca357bf9c03b9e0"} err="failed to get container status \"e9f8f2ac56a9a9220e0f2fcf3de55cdda8261bdd3eb977d2fca357bf9c03b9e0\": rpc error: code = NotFound desc = could not find container \"e9f8f2ac56a9a9220e0f2fcf3de55cdda8261bdd3eb977d2fca357bf9c03b9e0\": container with ID starting with e9f8f2ac56a9a9220e0f2fcf3de55cdda8261bdd3eb977d2fca357bf9c03b9e0 not found: ID does not exist"
Feb 24 00:22:23 crc kubenswrapper[4824]: I0224 00:22:23.665874 4824 scope.go:117] "RemoveContainer" containerID="4323a297be2acca3f04764d788a2723c01de5977be24510091e6dae18368cf46"
Feb 24 00:22:23 crc kubenswrapper[4824]: E0224 00:22:23.666093 4824 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4323a297be2acca3f04764d788a2723c01de5977be24510091e6dae18368cf46\": container with ID starting with 4323a297be2acca3f04764d788a2723c01de5977be24510091e6dae18368cf46 not found: ID does not exist" containerID="4323a297be2acca3f04764d788a2723c01de5977be24510091e6dae18368cf46"
Feb 24 00:22:23 crc kubenswrapper[4824]: I0224 00:22:23.666119 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4323a297be2acca3f04764d788a2723c01de5977be24510091e6dae18368cf46"} err="failed to get container status \"4323a297be2acca3f04764d788a2723c01de5977be24510091e6dae18368cf46\": rpc error: code = NotFound desc = could not find container \"4323a297be2acca3f04764d788a2723c01de5977be24510091e6dae18368cf46\": container with ID starting with 4323a297be2acca3f04764d788a2723c01de5977be24510091e6dae18368cf46 not found: ID does not exist"
Feb 24 00:22:24 crc kubenswrapper[4824]: I0224 00:22:24.598732 4824 generic.go:334] "Generic (PLEG): container finished" podID="8ad9454d-0615-4303-a59f-fbcc1d18c56a" containerID="1092c6a7e447e77503d91c65c3c5f6b11d0fed499e29b15c1ce9420c3b4bd460" exitCode=0
Feb 24 00:22:24 crc kubenswrapper[4824]: I0224 00:22:24.598793 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lzf7f" event={"ID":"8ad9454d-0615-4303-a59f-fbcc1d18c56a","Type":"ContainerDied","Data":"1092c6a7e447e77503d91c65c3c5f6b11d0fed499e29b15c1ce9420c3b4bd460"}
Feb 24 00:22:24 crc kubenswrapper[4824]: I0224 00:22:24.598823 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lzf7f" event={"ID":"8ad9454d-0615-4303-a59f-fbcc1d18c56a","Type":"ContainerStarted","Data":"f8e29f08a92667cb46b4ad4c4f5b0ccad1357b1ed5c00229f2037079821e8657"}
Feb 24 00:22:24 crc kubenswrapper[4824]: I0224 00:22:24.703684 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5daf2179-5386-4221-a15d-0e9787959357" path="/var/lib/kubelet/pods/5daf2179-5386-4221-a15d-0e9787959357/volumes"
Feb 24 00:22:25 crc kubenswrapper[4824]: I0224 00:22:25.611925 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lzf7f" event={"ID":"8ad9454d-0615-4303-a59f-fbcc1d18c56a","Type":"ContainerStarted","Data":"8ff769ada3b3ef2e8697e389c71973fa6f1d0e76779e9a02a3afdec95938f929"}
Feb 24 00:22:26 crc kubenswrapper[4824]: I0224 00:22:26.622006 4824 generic.go:334] "Generic (PLEG): container finished" podID="8ad9454d-0615-4303-a59f-fbcc1d18c56a" containerID="8ff769ada3b3ef2e8697e389c71973fa6f1d0e76779e9a02a3afdec95938f929" exitCode=0
Feb 24 00:22:26 crc kubenswrapper[4824]: I0224 00:22:26.622046 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lzf7f" event={"ID":"8ad9454d-0615-4303-a59f-fbcc1d18c56a","Type":"ContainerDied","Data":"8ff769ada3b3ef2e8697e389c71973fa6f1d0e76779e9a02a3afdec95938f929"}
Feb 24 00:22:27 crc kubenswrapper[4824]: I0224 00:22:27.047058 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-q2g4m"
Feb 24 00:22:27 crc kubenswrapper[4824]: I0224 00:22:27.102207 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-q2g4m"
Feb 24 00:22:27 crc kubenswrapper[4824]: I0224 00:22:27.631072 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lzf7f" event={"ID":"8ad9454d-0615-4303-a59f-fbcc1d18c56a","Type":"ContainerStarted","Data":"1c3d6d8ef243ea2dde76009ae12999ae54312654d0dda131ae174f8dbc2faffa"}
Feb 24 00:22:27 crc kubenswrapper[4824]: I0224 00:22:27.650819 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lzf7f" podStartSLOduration=3.210130608 podStartE2EDuration="5.650799192s" podCreationTimestamp="2026-02-24 00:22:22 +0000 UTC" firstStartedPulling="2026-02-24 00:22:24.600730401 +0000 UTC m=+1008.590354870" lastFinishedPulling="2026-02-24 00:22:27.041398985 +0000 UTC m=+1011.031023454" observedRunningTime="2026-02-24 00:22:27.649740655 +0000 UTC m=+1011.639365144" watchObservedRunningTime="2026-02-24 00:22:27.650799192 +0000 UTC m=+1011.640423681"
Feb 24 00:22:29 crc kubenswrapper[4824]: I0224 00:22:29.595735 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q2g4m"]
Feb 24 00:22:29 crc kubenswrapper[4824]: I0224 00:22:29.596486 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-q2g4m" podUID="5e34f5eb-6bdc-406e-aaed-ff979c64f6db" containerName="registry-server" containerID="cri-o://831c0b295bbcc0eee583b3247a887cdd5a9f7abc2fafc098ba177ebb72c41c62" gracePeriod=2
Feb 24 00:22:30 crc kubenswrapper[4824]: I0224 00:22:30.496048 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q2g4m"
Feb 24 00:22:30 crc kubenswrapper[4824]: I0224 00:22:30.596893 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e34f5eb-6bdc-406e-aaed-ff979c64f6db-utilities\") pod \"5e34f5eb-6bdc-406e-aaed-ff979c64f6db\" (UID: \"5e34f5eb-6bdc-406e-aaed-ff979c64f6db\") "
Feb 24 00:22:30 crc kubenswrapper[4824]: I0224 00:22:30.597041 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lq7jv\" (UniqueName: \"kubernetes.io/projected/5e34f5eb-6bdc-406e-aaed-ff979c64f6db-kube-api-access-lq7jv\") pod \"5e34f5eb-6bdc-406e-aaed-ff979c64f6db\" (UID: \"5e34f5eb-6bdc-406e-aaed-ff979c64f6db\") "
Feb 24 00:22:30 crc kubenswrapper[4824]: I0224 00:22:30.597083 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e34f5eb-6bdc-406e-aaed-ff979c64f6db-catalog-content\") pod \"5e34f5eb-6bdc-406e-aaed-ff979c64f6db\" (UID: \"5e34f5eb-6bdc-406e-aaed-ff979c64f6db\") "
Feb 24 00:22:30 crc kubenswrapper[4824]: I0224 00:22:30.597801 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e34f5eb-6bdc-406e-aaed-ff979c64f6db-utilities" (OuterVolumeSpecName: "utilities") pod "5e34f5eb-6bdc-406e-aaed-ff979c64f6db" (UID: "5e34f5eb-6bdc-406e-aaed-ff979c64f6db"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 00:22:30 crc kubenswrapper[4824]: I0224 00:22:30.604012 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e34f5eb-6bdc-406e-aaed-ff979c64f6db-kube-api-access-lq7jv" (OuterVolumeSpecName: "kube-api-access-lq7jv") pod "5e34f5eb-6bdc-406e-aaed-ff979c64f6db" (UID: "5e34f5eb-6bdc-406e-aaed-ff979c64f6db"). InnerVolumeSpecName "kube-api-access-lq7jv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 00:22:30 crc kubenswrapper[4824]: I0224 00:22:30.650058 4824 generic.go:334] "Generic (PLEG): container finished" podID="5e34f5eb-6bdc-406e-aaed-ff979c64f6db" containerID="831c0b295bbcc0eee583b3247a887cdd5a9f7abc2fafc098ba177ebb72c41c62" exitCode=0
Feb 24 00:22:30 crc kubenswrapper[4824]: I0224 00:22:30.650111 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q2g4m" event={"ID":"5e34f5eb-6bdc-406e-aaed-ff979c64f6db","Type":"ContainerDied","Data":"831c0b295bbcc0eee583b3247a887cdd5a9f7abc2fafc098ba177ebb72c41c62"}
Feb 24 00:22:30 crc kubenswrapper[4824]: I0224 00:22:30.650144 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q2g4m" event={"ID":"5e34f5eb-6bdc-406e-aaed-ff979c64f6db","Type":"ContainerDied","Data":"c154e571981648f7115b3e1d513a6874710dbdb0e4c9bd37e0e746f9c26709ae"}
Feb 24 00:22:30 crc kubenswrapper[4824]: I0224 00:22:30.650165 4824 scope.go:117] "RemoveContainer" containerID="831c0b295bbcc0eee583b3247a887cdd5a9f7abc2fafc098ba177ebb72c41c62"
Feb 24 00:22:30 crc kubenswrapper[4824]: I0224 00:22:30.650291 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q2g4m"
Feb 24 00:22:30 crc kubenswrapper[4824]: I0224 00:22:30.668198 4824 scope.go:117] "RemoveContainer" containerID="5bd458f8ab6e18655e507e975fc2e2718159f8ffd4e3a3ea392e57cab31af1b7"
Feb 24 00:22:30 crc kubenswrapper[4824]: I0224 00:22:30.698133 4824 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e34f5eb-6bdc-406e-aaed-ff979c64f6db-utilities\") on node \"crc\" DevicePath \"\""
Feb 24 00:22:30 crc kubenswrapper[4824]: I0224 00:22:30.698165 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lq7jv\" (UniqueName: \"kubernetes.io/projected/5e34f5eb-6bdc-406e-aaed-ff979c64f6db-kube-api-access-lq7jv\") on node \"crc\" DevicePath \"\""
Feb 24 00:22:30 crc kubenswrapper[4824]: I0224 00:22:30.705947 4824 scope.go:117] "RemoveContainer" containerID="465fc0bdabcaaeddc87638959d003ee0489fecdf8b29583f92b701894d90d1f2"
Feb 24 00:22:30 crc kubenswrapper[4824]: I0224 00:22:30.723180 4824 scope.go:117] "RemoveContainer" containerID="831c0b295bbcc0eee583b3247a887cdd5a9f7abc2fafc098ba177ebb72c41c62"
Feb 24 00:22:30 crc kubenswrapper[4824]: E0224 00:22:30.723744 4824 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"831c0b295bbcc0eee583b3247a887cdd5a9f7abc2fafc098ba177ebb72c41c62\": container with ID starting with 831c0b295bbcc0eee583b3247a887cdd5a9f7abc2fafc098ba177ebb72c41c62 not found: ID does not exist" containerID="831c0b295bbcc0eee583b3247a887cdd5a9f7abc2fafc098ba177ebb72c41c62"
Feb 24 00:22:30 crc kubenswrapper[4824]: I0224 00:22:30.723793 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"831c0b295bbcc0eee583b3247a887cdd5a9f7abc2fafc098ba177ebb72c41c62"} err="failed to get container status \"831c0b295bbcc0eee583b3247a887cdd5a9f7abc2fafc098ba177ebb72c41c62\": rpc error: code = NotFound desc = could not find container \"831c0b295bbcc0eee583b3247a887cdd5a9f7abc2fafc098ba177ebb72c41c62\": container with ID starting with 831c0b295bbcc0eee583b3247a887cdd5a9f7abc2fafc098ba177ebb72c41c62 not found: ID does not exist"
Feb 24 00:22:30 crc kubenswrapper[4824]: I0224 00:22:30.723819 4824 scope.go:117] "RemoveContainer" containerID="5bd458f8ab6e18655e507e975fc2e2718159f8ffd4e3a3ea392e57cab31af1b7"
Feb 24 00:22:30 crc kubenswrapper[4824]: E0224 00:22:30.724156 4824 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bd458f8ab6e18655e507e975fc2e2718159f8ffd4e3a3ea392e57cab31af1b7\": container with ID starting with 5bd458f8ab6e18655e507e975fc2e2718159f8ffd4e3a3ea392e57cab31af1b7 not found: ID does not exist" containerID="5bd458f8ab6e18655e507e975fc2e2718159f8ffd4e3a3ea392e57cab31af1b7"
Feb 24 00:22:30 crc kubenswrapper[4824]: I0224 00:22:30.724239 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bd458f8ab6e18655e507e975fc2e2718159f8ffd4e3a3ea392e57cab31af1b7"} err="failed to get container status \"5bd458f8ab6e18655e507e975fc2e2718159f8ffd4e3a3ea392e57cab31af1b7\": rpc error: code = NotFound desc = could not find container \"5bd458f8ab6e18655e507e975fc2e2718159f8ffd4e3a3ea392e57cab31af1b7\": container with ID starting with 5bd458f8ab6e18655e507e975fc2e2718159f8ffd4e3a3ea392e57cab31af1b7 not found: ID does not exist"
Feb 24 00:22:30 crc kubenswrapper[4824]: I0224 00:22:30.724280 4824 scope.go:117] "RemoveContainer" containerID="465fc0bdabcaaeddc87638959d003ee0489fecdf8b29583f92b701894d90d1f2"
Feb 24 00:22:30 crc kubenswrapper[4824]: E0224 00:22:30.724995 4824 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"465fc0bdabcaaeddc87638959d003ee0489fecdf8b29583f92b701894d90d1f2\": container with ID starting with 465fc0bdabcaaeddc87638959d003ee0489fecdf8b29583f92b701894d90d1f2 not found: ID does not exist" containerID="465fc0bdabcaaeddc87638959d003ee0489fecdf8b29583f92b701894d90d1f2"
Feb 24 00:22:30 crc kubenswrapper[4824]: I0224 00:22:30.725062 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"465fc0bdabcaaeddc87638959d003ee0489fecdf8b29583f92b701894d90d1f2"} err="failed to get container status \"465fc0bdabcaaeddc87638959d003ee0489fecdf8b29583f92b701894d90d1f2\": rpc error: code = NotFound desc = could not find container \"465fc0bdabcaaeddc87638959d003ee0489fecdf8b29583f92b701894d90d1f2\": container with ID starting with 465fc0bdabcaaeddc87638959d003ee0489fecdf8b29583f92b701894d90d1f2 not found: ID does not exist"
Feb 24 00:22:30 crc kubenswrapper[4824]: I0224 00:22:30.726281 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e34f5eb-6bdc-406e-aaed-ff979c64f6db-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5e34f5eb-6bdc-406e-aaed-ff979c64f6db" (UID: "5e34f5eb-6bdc-406e-aaed-ff979c64f6db"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 00:22:30 crc kubenswrapper[4824]: I0224 00:22:30.799949 4824 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e34f5eb-6bdc-406e-aaed-ff979c64f6db-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 24 00:22:30 crc kubenswrapper[4824]: I0224 00:22:30.983085 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q2g4m"]
Feb 24 00:22:30 crc kubenswrapper[4824]: I0224 00:22:30.992389 4824 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-q2g4m"]
Feb 24 00:22:32 crc kubenswrapper[4824]: I0224 00:22:32.707901 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e34f5eb-6bdc-406e-aaed-ff979c64f6db" path="/var/lib/kubelet/pods/5e34f5eb-6bdc-406e-aaed-ff979c64f6db/volumes"
Feb 24 00:22:33 crc kubenswrapper[4824]: I0224 00:22:33.064332 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lzf7f"
Feb 24 00:22:33 crc kubenswrapper[4824]: I0224 00:22:33.065282 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lzf7f"
Feb 24 00:22:33 crc kubenswrapper[4824]: I0224 00:22:33.138598 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lzf7f"
Feb 24 00:22:33 crc kubenswrapper[4824]: I0224 00:22:33.734560 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lzf7f"
Feb 24 00:22:34 crc kubenswrapper[4824]: I0224 00:22:34.790730 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lzf7f"]
Feb 24 00:22:36 crc kubenswrapper[4824]: I0224 00:22:36.701452 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lzf7f" podUID="8ad9454d-0615-4303-a59f-fbcc1d18c56a" containerName="registry-server" containerID="cri-o://1c3d6d8ef243ea2dde76009ae12999ae54312654d0dda131ae174f8dbc2faffa" gracePeriod=2
Feb 24 00:22:37 crc kubenswrapper[4824]: I0224 00:22:37.102559 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lzf7f"
Feb 24 00:22:37 crc kubenswrapper[4824]: I0224 00:22:37.198741 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ad9454d-0615-4303-a59f-fbcc1d18c56a-utilities\") pod \"8ad9454d-0615-4303-a59f-fbcc1d18c56a\" (UID: \"8ad9454d-0615-4303-a59f-fbcc1d18c56a\") "
Feb 24 00:22:37 crc kubenswrapper[4824]: I0224 00:22:37.198838 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ad9454d-0615-4303-a59f-fbcc1d18c56a-catalog-content\") pod \"8ad9454d-0615-4303-a59f-fbcc1d18c56a\" (UID: \"8ad9454d-0615-4303-a59f-fbcc1d18c56a\") "
Feb 24 00:22:37 crc kubenswrapper[4824]: I0224 00:22:37.198874 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drcs5\" (UniqueName: \"kubernetes.io/projected/8ad9454d-0615-4303-a59f-fbcc1d18c56a-kube-api-access-drcs5\") pod \"8ad9454d-0615-4303-a59f-fbcc1d18c56a\" (UID: \"8ad9454d-0615-4303-a59f-fbcc1d18c56a\") "
Feb 24 00:22:37 crc kubenswrapper[4824]: I0224 00:22:37.200077 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ad9454d-0615-4303-a59f-fbcc1d18c56a-utilities" (OuterVolumeSpecName: "utilities") pod "8ad9454d-0615-4303-a59f-fbcc1d18c56a" (UID: "8ad9454d-0615-4303-a59f-fbcc1d18c56a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 00:22:37 crc kubenswrapper[4824]: I0224 00:22:37.209501 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ad9454d-0615-4303-a59f-fbcc1d18c56a-kube-api-access-drcs5" (OuterVolumeSpecName: "kube-api-access-drcs5") pod "8ad9454d-0615-4303-a59f-fbcc1d18c56a" (UID: "8ad9454d-0615-4303-a59f-fbcc1d18c56a"). InnerVolumeSpecName "kube-api-access-drcs5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 00:22:37 crc kubenswrapper[4824]: I0224 00:22:37.253058 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ad9454d-0615-4303-a59f-fbcc1d18c56a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8ad9454d-0615-4303-a59f-fbcc1d18c56a" (UID: "8ad9454d-0615-4303-a59f-fbcc1d18c56a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 00:22:37 crc kubenswrapper[4824]: I0224 00:22:37.300851 4824 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ad9454d-0615-4303-a59f-fbcc1d18c56a-utilities\") on node \"crc\" DevicePath \"\""
Feb 24 00:22:37 crc kubenswrapper[4824]: I0224 00:22:37.300888 4824 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ad9454d-0615-4303-a59f-fbcc1d18c56a-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 24 00:22:37 crc kubenswrapper[4824]: I0224 00:22:37.300901 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drcs5\" (UniqueName: \"kubernetes.io/projected/8ad9454d-0615-4303-a59f-fbcc1d18c56a-kube-api-access-drcs5\") on node \"crc\" DevicePath \"\""
Feb 24 00:22:37 crc kubenswrapper[4824]: I0224 00:22:37.728724 4824 generic.go:334] "Generic (PLEG): container finished" podID="8ad9454d-0615-4303-a59f-fbcc1d18c56a" containerID="1c3d6d8ef243ea2dde76009ae12999ae54312654d0dda131ae174f8dbc2faffa" exitCode=0
Feb 24 00:22:37 crc kubenswrapper[4824]: I0224 00:22:37.728813 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lzf7f" event={"ID":"8ad9454d-0615-4303-a59f-fbcc1d18c56a","Type":"ContainerDied","Data":"1c3d6d8ef243ea2dde76009ae12999ae54312654d0dda131ae174f8dbc2faffa"}
Feb 24 00:22:37 crc kubenswrapper[4824]: I0224 00:22:37.728867 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lzf7f"
Feb 24 00:22:37 crc kubenswrapper[4824]: I0224 00:22:37.728906 4824 scope.go:117] "RemoveContainer" containerID="1c3d6d8ef243ea2dde76009ae12999ae54312654d0dda131ae174f8dbc2faffa"
Feb 24 00:22:37 crc kubenswrapper[4824]: I0224 00:22:37.728882 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lzf7f" event={"ID":"8ad9454d-0615-4303-a59f-fbcc1d18c56a","Type":"ContainerDied","Data":"f8e29f08a92667cb46b4ad4c4f5b0ccad1357b1ed5c00229f2037079821e8657"}
Feb 24 00:22:37 crc kubenswrapper[4824]: I0224 00:22:37.765828 4824 scope.go:117] "RemoveContainer" containerID="8ff769ada3b3ef2e8697e389c71973fa6f1d0e76779e9a02a3afdec95938f929"
Feb 24 00:22:37 crc kubenswrapper[4824]: I0224 00:22:37.805872 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lzf7f"]
Feb 24 00:22:37 crc kubenswrapper[4824]: I0224 00:22:37.808901 4824 scope.go:117] "RemoveContainer" containerID="1092c6a7e447e77503d91c65c3c5f6b11d0fed499e29b15c1ce9420c3b4bd460"
Feb 24 00:22:37 crc kubenswrapper[4824]: I0224 00:22:37.812275 4824 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lzf7f"]
Feb 24 00:22:37 crc kubenswrapper[4824]: I0224 00:22:37.829011 4824 scope.go:117] "RemoveContainer" containerID="1c3d6d8ef243ea2dde76009ae12999ae54312654d0dda131ae174f8dbc2faffa"
Feb 24
00:22:37 crc kubenswrapper[4824]: E0224 00:22:37.831122 4824 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c3d6d8ef243ea2dde76009ae12999ae54312654d0dda131ae174f8dbc2faffa\": container with ID starting with 1c3d6d8ef243ea2dde76009ae12999ae54312654d0dda131ae174f8dbc2faffa not found: ID does not exist" containerID="1c3d6d8ef243ea2dde76009ae12999ae54312654d0dda131ae174f8dbc2faffa" Feb 24 00:22:37 crc kubenswrapper[4824]: I0224 00:22:37.831172 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c3d6d8ef243ea2dde76009ae12999ae54312654d0dda131ae174f8dbc2faffa"} err="failed to get container status \"1c3d6d8ef243ea2dde76009ae12999ae54312654d0dda131ae174f8dbc2faffa\": rpc error: code = NotFound desc = could not find container \"1c3d6d8ef243ea2dde76009ae12999ae54312654d0dda131ae174f8dbc2faffa\": container with ID starting with 1c3d6d8ef243ea2dde76009ae12999ae54312654d0dda131ae174f8dbc2faffa not found: ID does not exist" Feb 24 00:22:37 crc kubenswrapper[4824]: I0224 00:22:37.831203 4824 scope.go:117] "RemoveContainer" containerID="8ff769ada3b3ef2e8697e389c71973fa6f1d0e76779e9a02a3afdec95938f929" Feb 24 00:22:37 crc kubenswrapper[4824]: E0224 00:22:37.833813 4824 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ff769ada3b3ef2e8697e389c71973fa6f1d0e76779e9a02a3afdec95938f929\": container with ID starting with 8ff769ada3b3ef2e8697e389c71973fa6f1d0e76779e9a02a3afdec95938f929 not found: ID does not exist" containerID="8ff769ada3b3ef2e8697e389c71973fa6f1d0e76779e9a02a3afdec95938f929" Feb 24 00:22:37 crc kubenswrapper[4824]: I0224 00:22:37.833840 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ff769ada3b3ef2e8697e389c71973fa6f1d0e76779e9a02a3afdec95938f929"} err="failed to get container status 
\"8ff769ada3b3ef2e8697e389c71973fa6f1d0e76779e9a02a3afdec95938f929\": rpc error: code = NotFound desc = could not find container \"8ff769ada3b3ef2e8697e389c71973fa6f1d0e76779e9a02a3afdec95938f929\": container with ID starting with 8ff769ada3b3ef2e8697e389c71973fa6f1d0e76779e9a02a3afdec95938f929 not found: ID does not exist" Feb 24 00:22:37 crc kubenswrapper[4824]: I0224 00:22:37.833856 4824 scope.go:117] "RemoveContainer" containerID="1092c6a7e447e77503d91c65c3c5f6b11d0fed499e29b15c1ce9420c3b4bd460" Feb 24 00:22:37 crc kubenswrapper[4824]: E0224 00:22:37.834126 4824 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1092c6a7e447e77503d91c65c3c5f6b11d0fed499e29b15c1ce9420c3b4bd460\": container with ID starting with 1092c6a7e447e77503d91c65c3c5f6b11d0fed499e29b15c1ce9420c3b4bd460 not found: ID does not exist" containerID="1092c6a7e447e77503d91c65c3c5f6b11d0fed499e29b15c1ce9420c3b4bd460" Feb 24 00:22:37 crc kubenswrapper[4824]: I0224 00:22:37.834176 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1092c6a7e447e77503d91c65c3c5f6b11d0fed499e29b15c1ce9420c3b4bd460"} err="failed to get container status \"1092c6a7e447e77503d91c65c3c5f6b11d0fed499e29b15c1ce9420c3b4bd460\": rpc error: code = NotFound desc = could not find container \"1092c6a7e447e77503d91c65c3c5f6b11d0fed499e29b15c1ce9420c3b4bd460\": container with ID starting with 1092c6a7e447e77503d91c65c3c5f6b11d0fed499e29b15c1ce9420c3b4bd460 not found: ID does not exist" Feb 24 00:22:38 crc kubenswrapper[4824]: I0224 00:22:38.706789 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ad9454d-0615-4303-a59f-fbcc1d18c56a" path="/var/lib/kubelet/pods/8ad9454d-0615-4303-a59f-fbcc1d18c56a/volumes" Feb 24 00:22:53 crc kubenswrapper[4824]: I0224 00:22:53.276486 4824 patch_prober.go:28] interesting pod/machine-config-daemon-vcbgn container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 00:22:53 crc kubenswrapper[4824]: I0224 00:22:53.277386 4824 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 00:22:53 crc kubenswrapper[4824]: I0224 00:22:53.277448 4824 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" Feb 24 00:22:53 crc kubenswrapper[4824]: I0224 00:22:53.278264 4824 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"960af4768f01705e18d424766fe08bb2ebb2088d821d2cb697f31ab6e24ccd28"} pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 24 00:22:53 crc kubenswrapper[4824]: I0224 00:22:53.278330 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" containerName="machine-config-daemon" containerID="cri-o://960af4768f01705e18d424766fe08bb2ebb2088d821d2cb697f31ab6e24ccd28" gracePeriod=600 Feb 24 00:22:53 crc kubenswrapper[4824]: I0224 00:22:53.845979 4824 generic.go:334] "Generic (PLEG): container finished" podID="939ca085-9383-42e6-b7d6-37f101137273" containerID="960af4768f01705e18d424766fe08bb2ebb2088d821d2cb697f31ab6e24ccd28" exitCode=0 Feb 24 00:22:53 crc kubenswrapper[4824]: I0224 00:22:53.846070 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" event={"ID":"939ca085-9383-42e6-b7d6-37f101137273","Type":"ContainerDied","Data":"960af4768f01705e18d424766fe08bb2ebb2088d821d2cb697f31ab6e24ccd28"} Feb 24 00:22:53 crc kubenswrapper[4824]: I0224 00:22:53.846441 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" event={"ID":"939ca085-9383-42e6-b7d6-37f101137273","Type":"ContainerStarted","Data":"32f31702ad77be87a49a0e3d023914422f1fbe192b728a29ed31dacaa99cc4eb"} Feb 24 00:22:53 crc kubenswrapper[4824]: I0224 00:22:53.846465 4824 scope.go:117] "RemoveContainer" containerID="43fc5998f7ab77a1ca73519cb6a4280f5869d3a50153e1dc6202d26bc4d9b6a3" Feb 24 00:23:11 crc kubenswrapper[4824]: I0224 00:23:11.987110 4824 generic.go:334] "Generic (PLEG): container finished" podID="87c4f157-f66c-485a-b29d-db482b59c2a1" containerID="9aa60fad0c5e5b125c85a695ee9699aa7edfa41691acaf3323a949446eac0c81" exitCode=0 Feb 24 00:23:11 crc kubenswrapper[4824]: I0224 00:23:11.987175 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"87c4f157-f66c-485a-b29d-db482b59c2a1","Type":"ContainerDied","Data":"9aa60fad0c5e5b125c85a695ee9699aa7edfa41691acaf3323a949446eac0c81"} Feb 24 00:23:13 crc kubenswrapper[4824]: I0224 00:23:13.300712 4824 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Feb 24 00:23:13 crc kubenswrapper[4824]: I0224 00:23:13.346232 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/87c4f157-f66c-485a-b29d-db482b59c2a1-builder-dockercfg-gqxxs-push\") pod \"87c4f157-f66c-485a-b29d-db482b59c2a1\" (UID: \"87c4f157-f66c-485a-b29d-db482b59c2a1\") " Feb 24 00:23:13 crc kubenswrapper[4824]: I0224 00:23:13.346309 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/87c4f157-f66c-485a-b29d-db482b59c2a1-build-proxy-ca-bundles\") pod \"87c4f157-f66c-485a-b29d-db482b59c2a1\" (UID: \"87c4f157-f66c-485a-b29d-db482b59c2a1\") " Feb 24 00:23:13 crc kubenswrapper[4824]: I0224 00:23:13.346355 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/87c4f157-f66c-485a-b29d-db482b59c2a1-build-ca-bundles\") pod \"87c4f157-f66c-485a-b29d-db482b59c2a1\" (UID: \"87c4f157-f66c-485a-b29d-db482b59c2a1\") " Feb 24 00:23:13 crc kubenswrapper[4824]: I0224 00:23:13.346411 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/87c4f157-f66c-485a-b29d-db482b59c2a1-buildworkdir\") pod \"87c4f157-f66c-485a-b29d-db482b59c2a1\" (UID: \"87c4f157-f66c-485a-b29d-db482b59c2a1\") " Feb 24 00:23:13 crc kubenswrapper[4824]: I0224 00:23:13.346485 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/87c4f157-f66c-485a-b29d-db482b59c2a1-build-blob-cache\") pod \"87c4f157-f66c-485a-b29d-db482b59c2a1\" (UID: \"87c4f157-f66c-485a-b29d-db482b59c2a1\") " Feb 24 00:23:13 crc kubenswrapper[4824]: I0224 00:23:13.346501 4824 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/87c4f157-f66c-485a-b29d-db482b59c2a1-buildcachedir\") pod \"87c4f157-f66c-485a-b29d-db482b59c2a1\" (UID: \"87c4f157-f66c-485a-b29d-db482b59c2a1\") " Feb 24 00:23:13 crc kubenswrapper[4824]: I0224 00:23:13.346541 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/87c4f157-f66c-485a-b29d-db482b59c2a1-builder-dockercfg-gqxxs-pull\") pod \"87c4f157-f66c-485a-b29d-db482b59c2a1\" (UID: \"87c4f157-f66c-485a-b29d-db482b59c2a1\") " Feb 24 00:23:13 crc kubenswrapper[4824]: I0224 00:23:13.346582 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/87c4f157-f66c-485a-b29d-db482b59c2a1-build-system-configs\") pod \"87c4f157-f66c-485a-b29d-db482b59c2a1\" (UID: \"87c4f157-f66c-485a-b29d-db482b59c2a1\") " Feb 24 00:23:13 crc kubenswrapper[4824]: I0224 00:23:13.346605 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/87c4f157-f66c-485a-b29d-db482b59c2a1-container-storage-run\") pod \"87c4f157-f66c-485a-b29d-db482b59c2a1\" (UID: \"87c4f157-f66c-485a-b29d-db482b59c2a1\") " Feb 24 00:23:13 crc kubenswrapper[4824]: I0224 00:23:13.346645 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49jr7\" (UniqueName: \"kubernetes.io/projected/87c4f157-f66c-485a-b29d-db482b59c2a1-kube-api-access-49jr7\") pod \"87c4f157-f66c-485a-b29d-db482b59c2a1\" (UID: \"87c4f157-f66c-485a-b29d-db482b59c2a1\") " Feb 24 00:23:13 crc kubenswrapper[4824]: I0224 00:23:13.346702 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/87c4f157-f66c-485a-b29d-db482b59c2a1-node-pullsecrets\") pod \"87c4f157-f66c-485a-b29d-db482b59c2a1\" (UID: \"87c4f157-f66c-485a-b29d-db482b59c2a1\") " Feb 24 00:23:13 crc kubenswrapper[4824]: I0224 00:23:13.346729 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/87c4f157-f66c-485a-b29d-db482b59c2a1-container-storage-root\") pod \"87c4f157-f66c-485a-b29d-db482b59c2a1\" (UID: \"87c4f157-f66c-485a-b29d-db482b59c2a1\") " Feb 24 00:23:13 crc kubenswrapper[4824]: I0224 00:23:13.346809 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/87c4f157-f66c-485a-b29d-db482b59c2a1-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "87c4f157-f66c-485a-b29d-db482b59c2a1" (UID: "87c4f157-f66c-485a-b29d-db482b59c2a1"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 00:23:13 crc kubenswrapper[4824]: I0224 00:23:13.347056 4824 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/87c4f157-f66c-485a-b29d-db482b59c2a1-buildcachedir\") on node \"crc\" DevicePath \"\"" Feb 24 00:23:13 crc kubenswrapper[4824]: I0224 00:23:13.347511 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/87c4f157-f66c-485a-b29d-db482b59c2a1-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "87c4f157-f66c-485a-b29d-db482b59c2a1" (UID: "87c4f157-f66c-485a-b29d-db482b59c2a1"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 00:23:13 crc kubenswrapper[4824]: I0224 00:23:13.347545 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87c4f157-f66c-485a-b29d-db482b59c2a1-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "87c4f157-f66c-485a-b29d-db482b59c2a1" (UID: "87c4f157-f66c-485a-b29d-db482b59c2a1"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:23:13 crc kubenswrapper[4824]: I0224 00:23:13.348338 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87c4f157-f66c-485a-b29d-db482b59c2a1-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "87c4f157-f66c-485a-b29d-db482b59c2a1" (UID: "87c4f157-f66c-485a-b29d-db482b59c2a1"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:23:13 crc kubenswrapper[4824]: I0224 00:23:13.348941 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87c4f157-f66c-485a-b29d-db482b59c2a1-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "87c4f157-f66c-485a-b29d-db482b59c2a1" (UID: "87c4f157-f66c-485a-b29d-db482b59c2a1"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:23:13 crc kubenswrapper[4824]: I0224 00:23:13.352437 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87c4f157-f66c-485a-b29d-db482b59c2a1-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "87c4f157-f66c-485a-b29d-db482b59c2a1" (UID: "87c4f157-f66c-485a-b29d-db482b59c2a1"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:23:13 crc kubenswrapper[4824]: I0224 00:23:13.357849 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87c4f157-f66c-485a-b29d-db482b59c2a1-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "87c4f157-f66c-485a-b29d-db482b59c2a1" (UID: "87c4f157-f66c-485a-b29d-db482b59c2a1"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:23:13 crc kubenswrapper[4824]: I0224 00:23:13.365625 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87c4f157-f66c-485a-b29d-db482b59c2a1-kube-api-access-49jr7" (OuterVolumeSpecName: "kube-api-access-49jr7") pod "87c4f157-f66c-485a-b29d-db482b59c2a1" (UID: "87c4f157-f66c-485a-b29d-db482b59c2a1"). InnerVolumeSpecName "kube-api-access-49jr7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:23:13 crc kubenswrapper[4824]: I0224 00:23:13.369975 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87c4f157-f66c-485a-b29d-db482b59c2a1-builder-dockercfg-gqxxs-pull" (OuterVolumeSpecName: "builder-dockercfg-gqxxs-pull") pod "87c4f157-f66c-485a-b29d-db482b59c2a1" (UID: "87c4f157-f66c-485a-b29d-db482b59c2a1"). InnerVolumeSpecName "builder-dockercfg-gqxxs-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:23:13 crc kubenswrapper[4824]: I0224 00:23:13.375491 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87c4f157-f66c-485a-b29d-db482b59c2a1-builder-dockercfg-gqxxs-push" (OuterVolumeSpecName: "builder-dockercfg-gqxxs-push") pod "87c4f157-f66c-485a-b29d-db482b59c2a1" (UID: "87c4f157-f66c-485a-b29d-db482b59c2a1"). InnerVolumeSpecName "builder-dockercfg-gqxxs-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:23:13 crc kubenswrapper[4824]: I0224 00:23:13.447593 4824 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/87c4f157-f66c-485a-b29d-db482b59c2a1-build-system-configs\") on node \"crc\" DevicePath \"\"" Feb 24 00:23:13 crc kubenswrapper[4824]: I0224 00:23:13.447945 4824 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/87c4f157-f66c-485a-b29d-db482b59c2a1-container-storage-run\") on node \"crc\" DevicePath \"\"" Feb 24 00:23:13 crc kubenswrapper[4824]: I0224 00:23:13.448095 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49jr7\" (UniqueName: \"kubernetes.io/projected/87c4f157-f66c-485a-b29d-db482b59c2a1-kube-api-access-49jr7\") on node \"crc\" DevicePath \"\"" Feb 24 00:23:13 crc kubenswrapper[4824]: I0224 00:23:13.448200 4824 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/87c4f157-f66c-485a-b29d-db482b59c2a1-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Feb 24 00:23:13 crc kubenswrapper[4824]: I0224 00:23:13.448303 4824 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/87c4f157-f66c-485a-b29d-db482b59c2a1-builder-dockercfg-gqxxs-push\") on node \"crc\" DevicePath \"\"" Feb 24 00:23:13 crc kubenswrapper[4824]: I0224 00:23:13.448408 4824 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/87c4f157-f66c-485a-b29d-db482b59c2a1-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 24 00:23:13 crc kubenswrapper[4824]: I0224 00:23:13.448507 4824 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/87c4f157-f66c-485a-b29d-db482b59c2a1-build-ca-bundles\") on node 
\"crc\" DevicePath \"\"" Feb 24 00:23:13 crc kubenswrapper[4824]: I0224 00:23:13.448611 4824 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/87c4f157-f66c-485a-b29d-db482b59c2a1-buildworkdir\") on node \"crc\" DevicePath \"\"" Feb 24 00:23:13 crc kubenswrapper[4824]: I0224 00:23:13.448682 4824 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/87c4f157-f66c-485a-b29d-db482b59c2a1-builder-dockercfg-gqxxs-pull\") on node \"crc\" DevicePath \"\"" Feb 24 00:23:13 crc kubenswrapper[4824]: I0224 00:23:13.556541 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87c4f157-f66c-485a-b29d-db482b59c2a1-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "87c4f157-f66c-485a-b29d-db482b59c2a1" (UID: "87c4f157-f66c-485a-b29d-db482b59c2a1"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:23:13 crc kubenswrapper[4824]: I0224 00:23:13.651745 4824 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/87c4f157-f66c-485a-b29d-db482b59c2a1-build-blob-cache\") on node \"crc\" DevicePath \"\"" Feb 24 00:23:14 crc kubenswrapper[4824]: I0224 00:23:14.007079 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"87c4f157-f66c-485a-b29d-db482b59c2a1","Type":"ContainerDied","Data":"360ddab163ca47c1a0fb0338c2f478a6b9ea9e3e2cf8ffd6966eab886336bcb9"} Feb 24 00:23:14 crc kubenswrapper[4824]: I0224 00:23:14.007509 4824 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="360ddab163ca47c1a0fb0338c2f478a6b9ea9e3e2cf8ffd6966eab886336bcb9" Feb 24 00:23:14 crc kubenswrapper[4824]: I0224 00:23:14.007369 4824 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Feb 24 00:23:15 crc kubenswrapper[4824]: I0224 00:23:15.137110 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87c4f157-f66c-485a-b29d-db482b59c2a1-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "87c4f157-f66c-485a-b29d-db482b59c2a1" (UID: "87c4f157-f66c-485a-b29d-db482b59c2a1"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:23:15 crc kubenswrapper[4824]: I0224 00:23:15.185227 4824 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/87c4f157-f66c-485a-b29d-db482b59c2a1-container-storage-root\") on node \"crc\" DevicePath \"\"" Feb 24 00:23:17 crc kubenswrapper[4824]: I0224 00:23:17.933233 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-core-1-build"] Feb 24 00:23:17 crc kubenswrapper[4824]: E0224 00:23:17.934252 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e34f5eb-6bdc-406e-aaed-ff979c64f6db" containerName="extract-utilities" Feb 24 00:23:17 crc kubenswrapper[4824]: I0224 00:23:17.934274 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e34f5eb-6bdc-406e-aaed-ff979c64f6db" containerName="extract-utilities" Feb 24 00:23:17 crc kubenswrapper[4824]: E0224 00:23:17.934294 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87c4f157-f66c-485a-b29d-db482b59c2a1" containerName="git-clone" Feb 24 00:23:17 crc kubenswrapper[4824]: I0224 00:23:17.934302 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="87c4f157-f66c-485a-b29d-db482b59c2a1" containerName="git-clone" Feb 24 00:23:17 crc kubenswrapper[4824]: E0224 00:23:17.934320 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5daf2179-5386-4221-a15d-0e9787959357" containerName="extract-utilities" Feb 24 00:23:17 crc 
kubenswrapper[4824]: I0224 00:23:17.934329 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="5daf2179-5386-4221-a15d-0e9787959357" containerName="extract-utilities" Feb 24 00:23:17 crc kubenswrapper[4824]: E0224 00:23:17.934345 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87c4f157-f66c-485a-b29d-db482b59c2a1" containerName="manage-dockerfile" Feb 24 00:23:17 crc kubenswrapper[4824]: I0224 00:23:17.934352 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="87c4f157-f66c-485a-b29d-db482b59c2a1" containerName="manage-dockerfile" Feb 24 00:23:17 crc kubenswrapper[4824]: E0224 00:23:17.934361 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5daf2179-5386-4221-a15d-0e9787959357" containerName="registry-server" Feb 24 00:23:17 crc kubenswrapper[4824]: I0224 00:23:17.934370 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="5daf2179-5386-4221-a15d-0e9787959357" containerName="registry-server" Feb 24 00:23:17 crc kubenswrapper[4824]: E0224 00:23:17.934378 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ad9454d-0615-4303-a59f-fbcc1d18c56a" containerName="extract-utilities" Feb 24 00:23:17 crc kubenswrapper[4824]: I0224 00:23:17.934385 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ad9454d-0615-4303-a59f-fbcc1d18c56a" containerName="extract-utilities" Feb 24 00:23:17 crc kubenswrapper[4824]: E0224 00:23:17.934395 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87c4f157-f66c-485a-b29d-db482b59c2a1" containerName="docker-build" Feb 24 00:23:17 crc kubenswrapper[4824]: I0224 00:23:17.934401 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="87c4f157-f66c-485a-b29d-db482b59c2a1" containerName="docker-build" Feb 24 00:23:17 crc kubenswrapper[4824]: E0224 00:23:17.934470 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e34f5eb-6bdc-406e-aaed-ff979c64f6db" containerName="registry-server" Feb 24 00:23:17 crc 
kubenswrapper[4824]: I0224 00:23:17.934479 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e34f5eb-6bdc-406e-aaed-ff979c64f6db" containerName="registry-server" Feb 24 00:23:17 crc kubenswrapper[4824]: E0224 00:23:17.934490 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ad9454d-0615-4303-a59f-fbcc1d18c56a" containerName="extract-content" Feb 24 00:23:17 crc kubenswrapper[4824]: I0224 00:23:17.934498 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ad9454d-0615-4303-a59f-fbcc1d18c56a" containerName="extract-content" Feb 24 00:23:17 crc kubenswrapper[4824]: E0224 00:23:17.934509 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ad9454d-0615-4303-a59f-fbcc1d18c56a" containerName="registry-server" Feb 24 00:23:17 crc kubenswrapper[4824]: I0224 00:23:17.934536 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ad9454d-0615-4303-a59f-fbcc1d18c56a" containerName="registry-server" Feb 24 00:23:17 crc kubenswrapper[4824]: E0224 00:23:17.934555 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5daf2179-5386-4221-a15d-0e9787959357" containerName="extract-content" Feb 24 00:23:17 crc kubenswrapper[4824]: I0224 00:23:17.934563 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="5daf2179-5386-4221-a15d-0e9787959357" containerName="extract-content" Feb 24 00:23:17 crc kubenswrapper[4824]: E0224 00:23:17.934579 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e34f5eb-6bdc-406e-aaed-ff979c64f6db" containerName="extract-content" Feb 24 00:23:17 crc kubenswrapper[4824]: I0224 00:23:17.934587 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e34f5eb-6bdc-406e-aaed-ff979c64f6db" containerName="extract-content" Feb 24 00:23:17 crc kubenswrapper[4824]: I0224 00:23:17.934753 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ad9454d-0615-4303-a59f-fbcc1d18c56a" containerName="registry-server" Feb 24 00:23:17 crc 
kubenswrapper[4824]: I0224 00:23:17.934765 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="5daf2179-5386-4221-a15d-0e9787959357" containerName="registry-server" Feb 24 00:23:17 crc kubenswrapper[4824]: I0224 00:23:17.934781 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e34f5eb-6bdc-406e-aaed-ff979c64f6db" containerName="registry-server" Feb 24 00:23:17 crc kubenswrapper[4824]: I0224 00:23:17.934791 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="87c4f157-f66c-485a-b29d-db482b59c2a1" containerName="docker-build" Feb 24 00:23:17 crc kubenswrapper[4824]: I0224 00:23:17.935558 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-1-build" Feb 24 00:23:17 crc kubenswrapper[4824]: I0224 00:23:17.943238 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-1-sys-config" Feb 24 00:23:17 crc kubenswrapper[4824]: I0224 00:23:17.943491 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-1-global-ca" Feb 24 00:23:17 crc kubenswrapper[4824]: I0224 00:23:17.943722 4824 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-gqxxs" Feb 24 00:23:17 crc kubenswrapper[4824]: I0224 00:23:17.943875 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-1-ca" Feb 24 00:23:17 crc kubenswrapper[4824]: I0224 00:23:17.963949 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-1-build"] Feb 24 00:23:18 crc kubenswrapper[4824]: I0224 00:23:18.130966 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-build-system-configs\") pod \"sg-core-1-build\" (UID: \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\") " 
pod="service-telemetry/sg-core-1-build" Feb 24 00:23:18 crc kubenswrapper[4824]: I0224 00:23:18.131040 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\") " pod="service-telemetry/sg-core-1-build" Feb 24 00:23:18 crc kubenswrapper[4824]: I0224 00:23:18.131063 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-container-storage-root\") pod \"sg-core-1-build\" (UID: \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\") " pod="service-telemetry/sg-core-1-build" Feb 24 00:23:18 crc kubenswrapper[4824]: I0224 00:23:18.132203 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-builder-dockercfg-gqxxs-pull\") pod \"sg-core-1-build\" (UID: \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\") " pod="service-telemetry/sg-core-1-build" Feb 24 00:23:18 crc kubenswrapper[4824]: I0224 00:23:18.132285 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsjxw\" (UniqueName: \"kubernetes.io/projected/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-kube-api-access-zsjxw\") pod \"sg-core-1-build\" (UID: \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\") " pod="service-telemetry/sg-core-1-build" Feb 24 00:23:18 crc kubenswrapper[4824]: I0224 00:23:18.132360 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-buildcachedir\") pod \"sg-core-1-build\" (UID: 
\"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\") " pod="service-telemetry/sg-core-1-build" Feb 24 00:23:18 crc kubenswrapper[4824]: I0224 00:23:18.132395 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\") " pod="service-telemetry/sg-core-1-build" Feb 24 00:23:18 crc kubenswrapper[4824]: I0224 00:23:18.132464 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\") " pod="service-telemetry/sg-core-1-build" Feb 24 00:23:18 crc kubenswrapper[4824]: I0224 00:23:18.132613 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\") " pod="service-telemetry/sg-core-1-build" Feb 24 00:23:18 crc kubenswrapper[4824]: I0224 00:23:18.132714 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-container-storage-run\") pod \"sg-core-1-build\" (UID: \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\") " pod="service-telemetry/sg-core-1-build" Feb 24 00:23:18 crc kubenswrapper[4824]: I0224 00:23:18.132758 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-builder-dockercfg-gqxxs-push\") pod \"sg-core-1-build\" 
(UID: \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\") " pod="service-telemetry/sg-core-1-build" Feb 24 00:23:18 crc kubenswrapper[4824]: I0224 00:23:18.132990 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-buildworkdir\") pod \"sg-core-1-build\" (UID: \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\") " pod="service-telemetry/sg-core-1-build" Feb 24 00:23:18 crc kubenswrapper[4824]: I0224 00:23:18.234999 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-buildcachedir\") pod \"sg-core-1-build\" (UID: \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\") " pod="service-telemetry/sg-core-1-build" Feb 24 00:23:18 crc kubenswrapper[4824]: I0224 00:23:18.235057 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\") " pod="service-telemetry/sg-core-1-build" Feb 24 00:23:18 crc kubenswrapper[4824]: I0224 00:23:18.235084 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\") " pod="service-telemetry/sg-core-1-build" Feb 24 00:23:18 crc kubenswrapper[4824]: I0224 00:23:18.235108 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\") " pod="service-telemetry/sg-core-1-build" Feb 24 00:23:18 crc 
kubenswrapper[4824]: I0224 00:23:18.235148 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-container-storage-run\") pod \"sg-core-1-build\" (UID: \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\") " pod="service-telemetry/sg-core-1-build" Feb 24 00:23:18 crc kubenswrapper[4824]: I0224 00:23:18.235175 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-builder-dockercfg-gqxxs-push\") pod \"sg-core-1-build\" (UID: \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\") " pod="service-telemetry/sg-core-1-build" Feb 24 00:23:18 crc kubenswrapper[4824]: I0224 00:23:18.235424 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\") " pod="service-telemetry/sg-core-1-build" Feb 24 00:23:18 crc kubenswrapper[4824]: I0224 00:23:18.235166 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-buildcachedir\") pod \"sg-core-1-build\" (UID: \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\") " pod="service-telemetry/sg-core-1-build" Feb 24 00:23:18 crc kubenswrapper[4824]: I0224 00:23:18.236015 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-container-storage-run\") pod \"sg-core-1-build\" (UID: \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\") " pod="service-telemetry/sg-core-1-build" Feb 24 00:23:18 crc kubenswrapper[4824]: I0224 00:23:18.236467 4824 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-buildworkdir\") pod \"sg-core-1-build\" (UID: \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\") " pod="service-telemetry/sg-core-1-build" Feb 24 00:23:18 crc kubenswrapper[4824]: I0224 00:23:18.236604 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\") " pod="service-telemetry/sg-core-1-build" Feb 24 00:23:18 crc kubenswrapper[4824]: I0224 00:23:18.236733 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-buildworkdir\") pod \"sg-core-1-build\" (UID: \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\") " pod="service-telemetry/sg-core-1-build" Feb 24 00:23:18 crc kubenswrapper[4824]: I0224 00:23:18.236807 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-build-system-configs\") pod \"sg-core-1-build\" (UID: \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\") " pod="service-telemetry/sg-core-1-build" Feb 24 00:23:18 crc kubenswrapper[4824]: I0224 00:23:18.236893 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\") " pod="service-telemetry/sg-core-1-build" Feb 24 00:23:18 crc kubenswrapper[4824]: I0224 00:23:18.236988 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: 
\"kubernetes.io/empty-dir/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-container-storage-root\") pod \"sg-core-1-build\" (UID: \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\") " pod="service-telemetry/sg-core-1-build" Feb 24 00:23:18 crc kubenswrapper[4824]: I0224 00:23:18.237105 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-builder-dockercfg-gqxxs-pull\") pod \"sg-core-1-build\" (UID: \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\") " pod="service-telemetry/sg-core-1-build" Feb 24 00:23:18 crc kubenswrapper[4824]: I0224 00:23:18.237216 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsjxw\" (UniqueName: \"kubernetes.io/projected/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-kube-api-access-zsjxw\") pod \"sg-core-1-build\" (UID: \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\") " pod="service-telemetry/sg-core-1-build" Feb 24 00:23:18 crc kubenswrapper[4824]: I0224 00:23:18.237159 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-build-system-configs\") pod \"sg-core-1-build\" (UID: \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\") " pod="service-telemetry/sg-core-1-build" Feb 24 00:23:18 crc kubenswrapper[4824]: I0224 00:23:18.237291 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-container-storage-root\") pod \"sg-core-1-build\" (UID: \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\") " pod="service-telemetry/sg-core-1-build" Feb 24 00:23:18 crc kubenswrapper[4824]: I0224 00:23:18.237440 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\") " pod="service-telemetry/sg-core-1-build" Feb 24 00:23:18 crc kubenswrapper[4824]: I0224 00:23:18.237829 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\") " pod="service-telemetry/sg-core-1-build" Feb 24 00:23:18 crc kubenswrapper[4824]: I0224 00:23:18.242343 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-builder-dockercfg-gqxxs-pull\") pod \"sg-core-1-build\" (UID: \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\") " pod="service-telemetry/sg-core-1-build" Feb 24 00:23:18 crc kubenswrapper[4824]: I0224 00:23:18.242408 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-builder-dockercfg-gqxxs-push\") pod \"sg-core-1-build\" (UID: \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\") " pod="service-telemetry/sg-core-1-build" Feb 24 00:23:18 crc kubenswrapper[4824]: I0224 00:23:18.261164 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsjxw\" (UniqueName: \"kubernetes.io/projected/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-kube-api-access-zsjxw\") pod \"sg-core-1-build\" (UID: \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\") " pod="service-telemetry/sg-core-1-build" Feb 24 00:23:18 crc kubenswrapper[4824]: I0224 00:23:18.562312 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-1-build" Feb 24 00:23:19 crc kubenswrapper[4824]: I0224 00:23:19.003749 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-1-build"] Feb 24 00:23:19 crc kubenswrapper[4824]: I0224 00:23:19.041187 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b","Type":"ContainerStarted","Data":"19bfab34288e937fc67ab246fc7a588fa300175ed336c770d21c78c7fed4ad39"} Feb 24 00:23:20 crc kubenswrapper[4824]: I0224 00:23:20.050693 4824 generic.go:334] "Generic (PLEG): container finished" podID="db72b2fe-6196-47d8-bbff-4e52d2fa9d9b" containerID="7a64796f61c6b30ad2646949ad30cbd46a80419e26ed2eb9237a6fafd7349ce1" exitCode=0 Feb 24 00:23:20 crc kubenswrapper[4824]: I0224 00:23:20.050760 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b","Type":"ContainerDied","Data":"7a64796f61c6b30ad2646949ad30cbd46a80419e26ed2eb9237a6fafd7349ce1"} Feb 24 00:23:21 crc kubenswrapper[4824]: I0224 00:23:21.060171 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b","Type":"ContainerStarted","Data":"c19054bd4aa49237cbc4c413f5e316849d64d7268bfda6b2fafaf463a52bcf01"} Feb 24 00:23:21 crc kubenswrapper[4824]: I0224 00:23:21.085558 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-core-1-build" podStartSLOduration=4.08553601 podStartE2EDuration="4.08553601s" podCreationTimestamp="2026-02-24 00:23:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:23:21.084798651 +0000 UTC m=+1065.074423120" watchObservedRunningTime="2026-02-24 00:23:21.08553601 +0000 UTC m=+1065.075160489" Feb 24 00:23:28 crc 
kubenswrapper[4824]: I0224 00:23:28.271239 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-core-1-build"] Feb 24 00:23:28 crc kubenswrapper[4824]: I0224 00:23:28.272009 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/sg-core-1-build" podUID="db72b2fe-6196-47d8-bbff-4e52d2fa9d9b" containerName="docker-build" containerID="cri-o://c19054bd4aa49237cbc4c413f5e316849d64d7268bfda6b2fafaf463a52bcf01" gracePeriod=30 Feb 24 00:23:28 crc kubenswrapper[4824]: I0224 00:23:28.659898 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-1-build_db72b2fe-6196-47d8-bbff-4e52d2fa9d9b/docker-build/0.log" Feb 24 00:23:28 crc kubenswrapper[4824]: I0224 00:23:28.661029 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-1-build" Feb 24 00:23:28 crc kubenswrapper[4824]: I0224 00:23:28.804708 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-container-storage-run\") pod \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\" (UID: \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\") " Feb 24 00:23:28 crc kubenswrapper[4824]: I0224 00:23:28.804842 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-builder-dockercfg-gqxxs-push\") pod \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\" (UID: \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\") " Feb 24 00:23:28 crc kubenswrapper[4824]: I0224 00:23:28.804914 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-container-storage-root\") pod \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\" (UID: 
\"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\") " Feb 24 00:23:28 crc kubenswrapper[4824]: I0224 00:23:28.804944 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsjxw\" (UniqueName: \"kubernetes.io/projected/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-kube-api-access-zsjxw\") pod \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\" (UID: \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\") " Feb 24 00:23:28 crc kubenswrapper[4824]: I0224 00:23:28.804969 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-buildworkdir\") pod \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\" (UID: \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\") " Feb 24 00:23:28 crc kubenswrapper[4824]: I0224 00:23:28.804995 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-build-proxy-ca-bundles\") pod \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\" (UID: \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\") " Feb 24 00:23:28 crc kubenswrapper[4824]: I0224 00:23:28.805073 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-build-ca-bundles\") pod \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\" (UID: \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\") " Feb 24 00:23:28 crc kubenswrapper[4824]: I0224 00:23:28.805352 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-builder-dockercfg-gqxxs-pull\") pod \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\" (UID: \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\") " Feb 24 00:23:28 crc kubenswrapper[4824]: I0224 00:23:28.805399 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-build-blob-cache\") pod \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\" (UID: \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\") " Feb 24 00:23:28 crc kubenswrapper[4824]: I0224 00:23:28.805422 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-build-system-configs\") pod \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\" (UID: \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\") " Feb 24 00:23:28 crc kubenswrapper[4824]: I0224 00:23:28.805446 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-buildcachedir\") pod \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\" (UID: \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\") " Feb 24 00:23:28 crc kubenswrapper[4824]: I0224 00:23:28.805481 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-node-pullsecrets\") pod \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\" (UID: \"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b\") " Feb 24 00:23:28 crc kubenswrapper[4824]: I0224 00:23:28.805663 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "db72b2fe-6196-47d8-bbff-4e52d2fa9d9b" (UID: "db72b2fe-6196-47d8-bbff-4e52d2fa9d9b"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 00:23:28 crc kubenswrapper[4824]: I0224 00:23:28.805720 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "db72b2fe-6196-47d8-bbff-4e52d2fa9d9b" (UID: "db72b2fe-6196-47d8-bbff-4e52d2fa9d9b"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 00:23:28 crc kubenswrapper[4824]: I0224 00:23:28.805728 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "db72b2fe-6196-47d8-bbff-4e52d2fa9d9b" (UID: "db72b2fe-6196-47d8-bbff-4e52d2fa9d9b"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:23:28 crc kubenswrapper[4824]: I0224 00:23:28.806273 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "db72b2fe-6196-47d8-bbff-4e52d2fa9d9b" (UID: "db72b2fe-6196-47d8-bbff-4e52d2fa9d9b"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:23:28 crc kubenswrapper[4824]: I0224 00:23:28.807477 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "db72b2fe-6196-47d8-bbff-4e52d2fa9d9b" (UID: "db72b2fe-6196-47d8-bbff-4e52d2fa9d9b"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:23:28 crc kubenswrapper[4824]: I0224 00:23:28.807605 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "db72b2fe-6196-47d8-bbff-4e52d2fa9d9b" (UID: "db72b2fe-6196-47d8-bbff-4e52d2fa9d9b"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:23:28 crc kubenswrapper[4824]: I0224 00:23:28.807638 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "db72b2fe-6196-47d8-bbff-4e52d2fa9d9b" (UID: "db72b2fe-6196-47d8-bbff-4e52d2fa9d9b"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:23:28 crc kubenswrapper[4824]: I0224 00:23:28.808081 4824 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 24 00:23:28 crc kubenswrapper[4824]: I0224 00:23:28.808456 4824 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-buildworkdir\") on node \"crc\" DevicePath \"\"" Feb 24 00:23:28 crc kubenswrapper[4824]: I0224 00:23:28.808481 4824 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 24 00:23:28 crc kubenswrapper[4824]: I0224 00:23:28.808499 4824 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: 
\"kubernetes.io/configmap/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-build-system-configs\") on node \"crc\" DevicePath \"\"" Feb 24 00:23:28 crc kubenswrapper[4824]: I0224 00:23:28.808540 4824 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-buildcachedir\") on node \"crc\" DevicePath \"\"" Feb 24 00:23:28 crc kubenswrapper[4824]: I0224 00:23:28.808558 4824 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Feb 24 00:23:28 crc kubenswrapper[4824]: I0224 00:23:28.808579 4824 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-container-storage-run\") on node \"crc\" DevicePath \"\"" Feb 24 00:23:28 crc kubenswrapper[4824]: I0224 00:23:28.813732 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-builder-dockercfg-gqxxs-push" (OuterVolumeSpecName: "builder-dockercfg-gqxxs-push") pod "db72b2fe-6196-47d8-bbff-4e52d2fa9d9b" (UID: "db72b2fe-6196-47d8-bbff-4e52d2fa9d9b"). InnerVolumeSpecName "builder-dockercfg-gqxxs-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:23:28 crc kubenswrapper[4824]: I0224 00:23:28.813943 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-kube-api-access-zsjxw" (OuterVolumeSpecName: "kube-api-access-zsjxw") pod "db72b2fe-6196-47d8-bbff-4e52d2fa9d9b" (UID: "db72b2fe-6196-47d8-bbff-4e52d2fa9d9b"). InnerVolumeSpecName "kube-api-access-zsjxw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:23:28 crc kubenswrapper[4824]: I0224 00:23:28.818747 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-builder-dockercfg-gqxxs-pull" (OuterVolumeSpecName: "builder-dockercfg-gqxxs-pull") pod "db72b2fe-6196-47d8-bbff-4e52d2fa9d9b" (UID: "db72b2fe-6196-47d8-bbff-4e52d2fa9d9b"). InnerVolumeSpecName "builder-dockercfg-gqxxs-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:23:28 crc kubenswrapper[4824]: I0224 00:23:28.910287 4824 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-builder-dockercfg-gqxxs-push\") on node \"crc\" DevicePath \"\"" Feb 24 00:23:28 crc kubenswrapper[4824]: I0224 00:23:28.910685 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsjxw\" (UniqueName: \"kubernetes.io/projected/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-kube-api-access-zsjxw\") on node \"crc\" DevicePath \"\"" Feb 24 00:23:28 crc kubenswrapper[4824]: I0224 00:23:28.910768 4824 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-builder-dockercfg-gqxxs-pull\") on node \"crc\" DevicePath \"\"" Feb 24 00:23:28 crc kubenswrapper[4824]: I0224 00:23:28.936783 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "db72b2fe-6196-47d8-bbff-4e52d2fa9d9b" (UID: "db72b2fe-6196-47d8-bbff-4e52d2fa9d9b"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:23:28 crc kubenswrapper[4824]: I0224 00:23:28.950170 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "db72b2fe-6196-47d8-bbff-4e52d2fa9d9b" (UID: "db72b2fe-6196-47d8-bbff-4e52d2fa9d9b"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:23:29 crc kubenswrapper[4824]: I0224 00:23:29.012716 4824 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-build-blob-cache\") on node \"crc\" DevicePath \"\"" Feb 24 00:23:29 crc kubenswrapper[4824]: I0224 00:23:29.012765 4824 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b-container-storage-root\") on node \"crc\" DevicePath \"\"" Feb 24 00:23:29 crc kubenswrapper[4824]: I0224 00:23:29.119459 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-1-build_db72b2fe-6196-47d8-bbff-4e52d2fa9d9b/docker-build/0.log" Feb 24 00:23:29 crc kubenswrapper[4824]: I0224 00:23:29.120058 4824 generic.go:334] "Generic (PLEG): container finished" podID="db72b2fe-6196-47d8-bbff-4e52d2fa9d9b" containerID="c19054bd4aa49237cbc4c413f5e316849d64d7268bfda6b2fafaf463a52bcf01" exitCode=1 Feb 24 00:23:29 crc kubenswrapper[4824]: I0224 00:23:29.120126 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b","Type":"ContainerDied","Data":"c19054bd4aa49237cbc4c413f5e316849d64d7268bfda6b2fafaf463a52bcf01"} Feb 24 00:23:29 crc kubenswrapper[4824]: I0224 00:23:29.120147 4824 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-1-build" Feb 24 00:23:29 crc kubenswrapper[4824]: I0224 00:23:29.120172 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"db72b2fe-6196-47d8-bbff-4e52d2fa9d9b","Type":"ContainerDied","Data":"19bfab34288e937fc67ab246fc7a588fa300175ed336c770d21c78c7fed4ad39"} Feb 24 00:23:29 crc kubenswrapper[4824]: I0224 00:23:29.120195 4824 scope.go:117] "RemoveContainer" containerID="c19054bd4aa49237cbc4c413f5e316849d64d7268bfda6b2fafaf463a52bcf01" Feb 24 00:23:29 crc kubenswrapper[4824]: I0224 00:23:29.151151 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-core-1-build"] Feb 24 00:23:29 crc kubenswrapper[4824]: I0224 00:23:29.156249 4824 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/sg-core-1-build"] Feb 24 00:23:29 crc kubenswrapper[4824]: I0224 00:23:29.172001 4824 scope.go:117] "RemoveContainer" containerID="7a64796f61c6b30ad2646949ad30cbd46a80419e26ed2eb9237a6fafd7349ce1" Feb 24 00:23:29 crc kubenswrapper[4824]: I0224 00:23:29.202324 4824 scope.go:117] "RemoveContainer" containerID="c19054bd4aa49237cbc4c413f5e316849d64d7268bfda6b2fafaf463a52bcf01" Feb 24 00:23:29 crc kubenswrapper[4824]: E0224 00:23:29.202926 4824 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c19054bd4aa49237cbc4c413f5e316849d64d7268bfda6b2fafaf463a52bcf01\": container with ID starting with c19054bd4aa49237cbc4c413f5e316849d64d7268bfda6b2fafaf463a52bcf01 not found: ID does not exist" containerID="c19054bd4aa49237cbc4c413f5e316849d64d7268bfda6b2fafaf463a52bcf01" Feb 24 00:23:29 crc kubenswrapper[4824]: I0224 00:23:29.202988 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c19054bd4aa49237cbc4c413f5e316849d64d7268bfda6b2fafaf463a52bcf01"} err="failed to get container status 
\"c19054bd4aa49237cbc4c413f5e316849d64d7268bfda6b2fafaf463a52bcf01\": rpc error: code = NotFound desc = could not find container \"c19054bd4aa49237cbc4c413f5e316849d64d7268bfda6b2fafaf463a52bcf01\": container with ID starting with c19054bd4aa49237cbc4c413f5e316849d64d7268bfda6b2fafaf463a52bcf01 not found: ID does not exist" Feb 24 00:23:29 crc kubenswrapper[4824]: I0224 00:23:29.203021 4824 scope.go:117] "RemoveContainer" containerID="7a64796f61c6b30ad2646949ad30cbd46a80419e26ed2eb9237a6fafd7349ce1" Feb 24 00:23:29 crc kubenswrapper[4824]: E0224 00:23:29.203352 4824 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a64796f61c6b30ad2646949ad30cbd46a80419e26ed2eb9237a6fafd7349ce1\": container with ID starting with 7a64796f61c6b30ad2646949ad30cbd46a80419e26ed2eb9237a6fafd7349ce1 not found: ID does not exist" containerID="7a64796f61c6b30ad2646949ad30cbd46a80419e26ed2eb9237a6fafd7349ce1" Feb 24 00:23:29 crc kubenswrapper[4824]: I0224 00:23:29.203372 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a64796f61c6b30ad2646949ad30cbd46a80419e26ed2eb9237a6fafd7349ce1"} err="failed to get container status \"7a64796f61c6b30ad2646949ad30cbd46a80419e26ed2eb9237a6fafd7349ce1\": rpc error: code = NotFound desc = could not find container \"7a64796f61c6b30ad2646949ad30cbd46a80419e26ed2eb9237a6fafd7349ce1\": container with ID starting with 7a64796f61c6b30ad2646949ad30cbd46a80419e26ed2eb9237a6fafd7349ce1 not found: ID does not exist" Feb 24 00:23:29 crc kubenswrapper[4824]: I0224 00:23:29.941824 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-core-2-build"] Feb 24 00:23:29 crc kubenswrapper[4824]: E0224 00:23:29.942225 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db72b2fe-6196-47d8-bbff-4e52d2fa9d9b" containerName="manage-dockerfile" Feb 24 00:23:29 crc kubenswrapper[4824]: I0224 00:23:29.942250 4824 
state_mem.go:107] "Deleted CPUSet assignment" podUID="db72b2fe-6196-47d8-bbff-4e52d2fa9d9b" containerName="manage-dockerfile" Feb 24 00:23:29 crc kubenswrapper[4824]: E0224 00:23:29.942283 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db72b2fe-6196-47d8-bbff-4e52d2fa9d9b" containerName="docker-build" Feb 24 00:23:29 crc kubenswrapper[4824]: I0224 00:23:29.942294 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="db72b2fe-6196-47d8-bbff-4e52d2fa9d9b" containerName="docker-build" Feb 24 00:23:29 crc kubenswrapper[4824]: I0224 00:23:29.942491 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="db72b2fe-6196-47d8-bbff-4e52d2fa9d9b" containerName="docker-build" Feb 24 00:23:29 crc kubenswrapper[4824]: I0224 00:23:29.943888 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-2-build" Feb 24 00:23:29 crc kubenswrapper[4824]: I0224 00:23:29.946392 4824 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-gqxxs" Feb 24 00:23:29 crc kubenswrapper[4824]: I0224 00:23:29.947732 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-2-sys-config" Feb 24 00:23:29 crc kubenswrapper[4824]: I0224 00:23:29.948106 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-2-global-ca" Feb 24 00:23:29 crc kubenswrapper[4824]: I0224 00:23:29.950794 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-2-ca" Feb 24 00:23:29 crc kubenswrapper[4824]: I0224 00:23:29.953318 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-2-build"] Feb 24 00:23:30 crc kubenswrapper[4824]: I0224 00:23:30.128963 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: 
\"kubernetes.io/secret/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-builder-dockercfg-gqxxs-pull\") pod \"sg-core-2-build\" (UID: \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\") " pod="service-telemetry/sg-core-2-build" Feb 24 00:23:30 crc kubenswrapper[4824]: I0224 00:23:30.129006 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\") " pod="service-telemetry/sg-core-2-build" Feb 24 00:23:30 crc kubenswrapper[4824]: I0224 00:23:30.129035 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\") " pod="service-telemetry/sg-core-2-build" Feb 24 00:23:30 crc kubenswrapper[4824]: I0224 00:23:30.129248 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-buildworkdir\") pod \"sg-core-2-build\" (UID: \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\") " pod="service-telemetry/sg-core-2-build" Feb 24 00:23:30 crc kubenswrapper[4824]: I0224 00:23:30.129287 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-build-system-configs\") pod \"sg-core-2-build\" (UID: \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\") " pod="service-telemetry/sg-core-2-build" Feb 24 00:23:30 crc kubenswrapper[4824]: I0224 00:23:30.129358 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: 
\"kubernetes.io/host-path/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-buildcachedir\") pod \"sg-core-2-build\" (UID: \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\") " pod="service-telemetry/sg-core-2-build" Feb 24 00:23:30 crc kubenswrapper[4824]: I0224 00:23:30.129394 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\") " pod="service-telemetry/sg-core-2-build" Feb 24 00:23:30 crc kubenswrapper[4824]: I0224 00:23:30.129430 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-builder-dockercfg-gqxxs-push\") pod \"sg-core-2-build\" (UID: \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\") " pod="service-telemetry/sg-core-2-build" Feb 24 00:23:30 crc kubenswrapper[4824]: I0224 00:23:30.129459 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\") " pod="service-telemetry/sg-core-2-build" Feb 24 00:23:30 crc kubenswrapper[4824]: I0224 00:23:30.129537 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mcgr\" (UniqueName: \"kubernetes.io/projected/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-kube-api-access-4mcgr\") pod \"sg-core-2-build\" (UID: \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\") " pod="service-telemetry/sg-core-2-build" Feb 24 00:23:30 crc kubenswrapper[4824]: I0224 00:23:30.129603 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-container-storage-root\") pod \"sg-core-2-build\" (UID: \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\") " pod="service-telemetry/sg-core-2-build" Feb 24 00:23:30 crc kubenswrapper[4824]: I0224 00:23:30.129632 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-container-storage-run\") pod \"sg-core-2-build\" (UID: \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\") " pod="service-telemetry/sg-core-2-build" Feb 24 00:23:30 crc kubenswrapper[4824]: I0224 00:23:30.230496 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-container-storage-root\") pod \"sg-core-2-build\" (UID: \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\") " pod="service-telemetry/sg-core-2-build" Feb 24 00:23:30 crc kubenswrapper[4824]: I0224 00:23:30.230585 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-container-storage-run\") pod \"sg-core-2-build\" (UID: \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\") " pod="service-telemetry/sg-core-2-build" Feb 24 00:23:30 crc kubenswrapper[4824]: I0224 00:23:30.230620 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-builder-dockercfg-gqxxs-pull\") pod \"sg-core-2-build\" (UID: \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\") " pod="service-telemetry/sg-core-2-build" Feb 24 00:23:30 crc kubenswrapper[4824]: I0224 00:23:30.230639 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\") " pod="service-telemetry/sg-core-2-build" Feb 24 00:23:30 crc kubenswrapper[4824]: I0224 00:23:30.230662 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\") " pod="service-telemetry/sg-core-2-build" Feb 24 00:23:30 crc kubenswrapper[4824]: I0224 00:23:30.230715 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-buildworkdir\") pod \"sg-core-2-build\" (UID: \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\") " pod="service-telemetry/sg-core-2-build" Feb 24 00:23:30 crc kubenswrapper[4824]: I0224 00:23:30.230737 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-build-system-configs\") pod \"sg-core-2-build\" (UID: \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\") " pod="service-telemetry/sg-core-2-build" Feb 24 00:23:30 crc kubenswrapper[4824]: I0224 00:23:30.230794 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-buildcachedir\") pod \"sg-core-2-build\" (UID: \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\") " pod="service-telemetry/sg-core-2-build" Feb 24 00:23:30 crc kubenswrapper[4824]: I0224 00:23:30.230815 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: 
\"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\") " pod="service-telemetry/sg-core-2-build" Feb 24 00:23:30 crc kubenswrapper[4824]: I0224 00:23:30.230841 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-builder-dockercfg-gqxxs-push\") pod \"sg-core-2-build\" (UID: \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\") " pod="service-telemetry/sg-core-2-build" Feb 24 00:23:30 crc kubenswrapper[4824]: I0224 00:23:30.230863 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\") " pod="service-telemetry/sg-core-2-build" Feb 24 00:23:30 crc kubenswrapper[4824]: I0224 00:23:30.230883 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mcgr\" (UniqueName: \"kubernetes.io/projected/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-kube-api-access-4mcgr\") pod \"sg-core-2-build\" (UID: \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\") " pod="service-telemetry/sg-core-2-build" Feb 24 00:23:30 crc kubenswrapper[4824]: I0224 00:23:30.230950 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-container-storage-root\") pod \"sg-core-2-build\" (UID: \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\") " pod="service-telemetry/sg-core-2-build" Feb 24 00:23:30 crc kubenswrapper[4824]: I0224 00:23:30.231272 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-container-storage-run\") pod \"sg-core-2-build\" (UID: \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\") " pod="service-telemetry/sg-core-2-build" Feb 
24 00:23:30 crc kubenswrapper[4824]: I0224 00:23:30.231566 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\") " pod="service-telemetry/sg-core-2-build" Feb 24 00:23:30 crc kubenswrapper[4824]: I0224 00:23:30.231844 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-buildcachedir\") pod \"sg-core-2-build\" (UID: \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\") " pod="service-telemetry/sg-core-2-build" Feb 24 00:23:30 crc kubenswrapper[4824]: I0224 00:23:30.231880 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\") " pod="service-telemetry/sg-core-2-build" Feb 24 00:23:30 crc kubenswrapper[4824]: I0224 00:23:30.232131 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-buildworkdir\") pod \"sg-core-2-build\" (UID: \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\") " pod="service-telemetry/sg-core-2-build" Feb 24 00:23:30 crc kubenswrapper[4824]: I0224 00:23:30.232328 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-build-system-configs\") pod \"sg-core-2-build\" (UID: \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\") " pod="service-telemetry/sg-core-2-build" Feb 24 00:23:30 crc kubenswrapper[4824]: I0224 00:23:30.232867 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\") " pod="service-telemetry/sg-core-2-build" Feb 24 00:23:30 crc kubenswrapper[4824]: I0224 00:23:30.233110 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\") " pod="service-telemetry/sg-core-2-build" Feb 24 00:23:30 crc kubenswrapper[4824]: I0224 00:23:30.239079 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-builder-dockercfg-gqxxs-pull\") pod \"sg-core-2-build\" (UID: \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\") " pod="service-telemetry/sg-core-2-build" Feb 24 00:23:30 crc kubenswrapper[4824]: I0224 00:23:30.239116 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-builder-dockercfg-gqxxs-push\") pod \"sg-core-2-build\" (UID: \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\") " pod="service-telemetry/sg-core-2-build" Feb 24 00:23:30 crc kubenswrapper[4824]: I0224 00:23:30.253262 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mcgr\" (UniqueName: \"kubernetes.io/projected/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-kube-api-access-4mcgr\") pod \"sg-core-2-build\" (UID: \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\") " pod="service-telemetry/sg-core-2-build" Feb 24 00:23:30 crc kubenswrapper[4824]: I0224 00:23:30.302047 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-2-build" Feb 24 00:23:30 crc kubenswrapper[4824]: I0224 00:23:30.533247 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-2-build"] Feb 24 00:23:30 crc kubenswrapper[4824]: I0224 00:23:30.703239 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db72b2fe-6196-47d8-bbff-4e52d2fa9d9b" path="/var/lib/kubelet/pods/db72b2fe-6196-47d8-bbff-4e52d2fa9d9b/volumes" Feb 24 00:23:31 crc kubenswrapper[4824]: I0224 00:23:31.138489 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"77c0907d-81a1-4a0c-81cf-cac502f6c8dc","Type":"ContainerStarted","Data":"010655e9ac8d5df123bdac2e01019ab3d59a1beff08289046366fcfe142aaa7e"} Feb 24 00:23:31 crc kubenswrapper[4824]: I0224 00:23:31.138906 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"77c0907d-81a1-4a0c-81cf-cac502f6c8dc","Type":"ContainerStarted","Data":"b8562531694c8525f4bf4aec2b90a7ef10fb034a969d79ee542b0dea17176f13"} Feb 24 00:23:31 crc kubenswrapper[4824]: E0224 00:23:31.287018 4824 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.151:37134->38.102.83.151:38559: read tcp 38.102.83.151:37134->38.102.83.151:38559: read: connection reset by peer Feb 24 00:23:32 crc kubenswrapper[4824]: I0224 00:23:32.146907 4824 generic.go:334] "Generic (PLEG): container finished" podID="77c0907d-81a1-4a0c-81cf-cac502f6c8dc" containerID="010655e9ac8d5df123bdac2e01019ab3d59a1beff08289046366fcfe142aaa7e" exitCode=0 Feb 24 00:23:32 crc kubenswrapper[4824]: I0224 00:23:32.146979 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"77c0907d-81a1-4a0c-81cf-cac502f6c8dc","Type":"ContainerDied","Data":"010655e9ac8d5df123bdac2e01019ab3d59a1beff08289046366fcfe142aaa7e"} Feb 24 00:23:33 crc kubenswrapper[4824]: I0224 00:23:33.156319 4824 
generic.go:334] "Generic (PLEG): container finished" podID="77c0907d-81a1-4a0c-81cf-cac502f6c8dc" containerID="556b307c7c6ac911375fbe93a286eb266325bd855d629f38954bc83024b67b07" exitCode=0 Feb 24 00:23:33 crc kubenswrapper[4824]: I0224 00:23:33.156422 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"77c0907d-81a1-4a0c-81cf-cac502f6c8dc","Type":"ContainerDied","Data":"556b307c7c6ac911375fbe93a286eb266325bd855d629f38954bc83024b67b07"} Feb 24 00:23:33 crc kubenswrapper[4824]: I0224 00:23:33.201611 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-2-build_77c0907d-81a1-4a0c-81cf-cac502f6c8dc/manage-dockerfile/0.log" Feb 24 00:23:34 crc kubenswrapper[4824]: I0224 00:23:34.170421 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"77c0907d-81a1-4a0c-81cf-cac502f6c8dc","Type":"ContainerStarted","Data":"5924f945ac9743d4f5656ab07e6825552fcd0d8f58684ebca3f29b652db3d784"} Feb 24 00:23:34 crc kubenswrapper[4824]: I0224 00:23:34.211961 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-core-2-build" podStartSLOduration=5.211935346 podStartE2EDuration="5.211935346s" podCreationTimestamp="2026-02-24 00:23:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:23:34.2062421 +0000 UTC m=+1078.195866599" watchObservedRunningTime="2026-02-24 00:23:34.211935346 +0000 UTC m=+1078.201559825" Feb 24 00:24:53 crc kubenswrapper[4824]: I0224 00:24:53.276756 4824 patch_prober.go:28] interesting pod/machine-config-daemon-vcbgn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 00:24:53 crc kubenswrapper[4824]: I0224 
00:24:53.277712 4824 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 00:25:23 crc kubenswrapper[4824]: I0224 00:25:23.276711 4824 patch_prober.go:28] interesting pod/machine-config-daemon-vcbgn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 00:25:23 crc kubenswrapper[4824]: I0224 00:25:23.277670 4824 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 00:25:53 crc kubenswrapper[4824]: I0224 00:25:53.276111 4824 patch_prober.go:28] interesting pod/machine-config-daemon-vcbgn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 00:25:53 crc kubenswrapper[4824]: I0224 00:25:53.276882 4824 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 00:25:53 crc kubenswrapper[4824]: I0224 00:25:53.276943 4824 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" Feb 24 00:25:53 crc kubenswrapper[4824]: I0224 00:25:53.278584 4824 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"32f31702ad77be87a49a0e3d023914422f1fbe192b728a29ed31dacaa99cc4eb"} pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 24 00:25:53 crc kubenswrapper[4824]: I0224 00:25:53.278735 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" containerName="machine-config-daemon" containerID="cri-o://32f31702ad77be87a49a0e3d023914422f1fbe192b728a29ed31dacaa99cc4eb" gracePeriod=600 Feb 24 00:25:54 crc kubenswrapper[4824]: I0224 00:25:54.223103 4824 generic.go:334] "Generic (PLEG): container finished" podID="939ca085-9383-42e6-b7d6-37f101137273" containerID="32f31702ad77be87a49a0e3d023914422f1fbe192b728a29ed31dacaa99cc4eb" exitCode=0 Feb 24 00:25:54 crc kubenswrapper[4824]: I0224 00:25:54.223171 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" event={"ID":"939ca085-9383-42e6-b7d6-37f101137273","Type":"ContainerDied","Data":"32f31702ad77be87a49a0e3d023914422f1fbe192b728a29ed31dacaa99cc4eb"} Feb 24 00:25:54 crc kubenswrapper[4824]: I0224 00:25:54.223664 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" event={"ID":"939ca085-9383-42e6-b7d6-37f101137273","Type":"ContainerStarted","Data":"8d819df51f5c54106cb947aecba467be6a7835d606611afb3e6526ac4d026f80"} Feb 24 00:25:54 crc kubenswrapper[4824]: I0224 00:25:54.223699 4824 scope.go:117] "RemoveContainer" 
containerID="960af4768f01705e18d424766fe08bb2ebb2088d821d2cb697f31ab6e24ccd28" Feb 24 00:26:46 crc kubenswrapper[4824]: I0224 00:26:46.621266 4824 generic.go:334] "Generic (PLEG): container finished" podID="77c0907d-81a1-4a0c-81cf-cac502f6c8dc" containerID="5924f945ac9743d4f5656ab07e6825552fcd0d8f58684ebca3f29b652db3d784" exitCode=0 Feb 24 00:26:46 crc kubenswrapper[4824]: I0224 00:26:46.621358 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"77c0907d-81a1-4a0c-81cf-cac502f6c8dc","Type":"ContainerDied","Data":"5924f945ac9743d4f5656ab07e6825552fcd0d8f58684ebca3f29b652db3d784"} Feb 24 00:26:47 crc kubenswrapper[4824]: I0224 00:26:47.867936 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-2-build" Feb 24 00:26:47 crc kubenswrapper[4824]: I0224 00:26:47.929321 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-build-ca-bundles\") pod \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\" (UID: \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\") " Feb 24 00:26:47 crc kubenswrapper[4824]: I0224 00:26:47.929461 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-builder-dockercfg-gqxxs-push\") pod \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\" (UID: \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\") " Feb 24 00:26:47 crc kubenswrapper[4824]: I0224 00:26:47.929506 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-build-blob-cache\") pod \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\" (UID: \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\") " Feb 24 00:26:47 crc kubenswrapper[4824]: I0224 00:26:47.929544 4824 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-container-storage-run\") pod \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\" (UID: \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\") " Feb 24 00:26:47 crc kubenswrapper[4824]: I0224 00:26:47.929598 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-buildcachedir\") pod \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\" (UID: \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\") " Feb 24 00:26:47 crc kubenswrapper[4824]: I0224 00:26:47.929627 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mcgr\" (UniqueName: \"kubernetes.io/projected/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-kube-api-access-4mcgr\") pod \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\" (UID: \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\") " Feb 24 00:26:47 crc kubenswrapper[4824]: I0224 00:26:47.929667 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-node-pullsecrets\") pod \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\" (UID: \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\") " Feb 24 00:26:47 crc kubenswrapper[4824]: I0224 00:26:47.929713 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-buildworkdir\") pod \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\" (UID: \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\") " Feb 24 00:26:47 crc kubenswrapper[4824]: I0224 00:26:47.929753 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-build-proxy-ca-bundles\") pod \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\" (UID: \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\") " Feb 24 00:26:47 crc kubenswrapper[4824]: I0224 00:26:47.929788 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-build-system-configs\") pod \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\" (UID: \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\") " Feb 24 00:26:47 crc kubenswrapper[4824]: I0224 00:26:47.929862 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-builder-dockercfg-gqxxs-pull\") pod \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\" (UID: \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\") " Feb 24 00:26:47 crc kubenswrapper[4824]: I0224 00:26:47.929897 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-container-storage-root\") pod \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\" (UID: \"77c0907d-81a1-4a0c-81cf-cac502f6c8dc\") " Feb 24 00:26:47 crc kubenswrapper[4824]: I0224 00:26:47.931056 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "77c0907d-81a1-4a0c-81cf-cac502f6c8dc" (UID: "77c0907d-81a1-4a0c-81cf-cac502f6c8dc"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:26:47 crc kubenswrapper[4824]: I0224 00:26:47.931784 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "77c0907d-81a1-4a0c-81cf-cac502f6c8dc" (UID: "77c0907d-81a1-4a0c-81cf-cac502f6c8dc"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:26:47 crc kubenswrapper[4824]: I0224 00:26:47.933638 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "77c0907d-81a1-4a0c-81cf-cac502f6c8dc" (UID: "77c0907d-81a1-4a0c-81cf-cac502f6c8dc"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 00:26:47 crc kubenswrapper[4824]: I0224 00:26:47.934139 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "77c0907d-81a1-4a0c-81cf-cac502f6c8dc" (UID: "77c0907d-81a1-4a0c-81cf-cac502f6c8dc"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:26:47 crc kubenswrapper[4824]: I0224 00:26:47.934192 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "77c0907d-81a1-4a0c-81cf-cac502f6c8dc" (UID: "77c0907d-81a1-4a0c-81cf-cac502f6c8dc"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 00:26:47 crc kubenswrapper[4824]: I0224 00:26:47.934206 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "77c0907d-81a1-4a0c-81cf-cac502f6c8dc" (UID: "77c0907d-81a1-4a0c-81cf-cac502f6c8dc"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:26:47 crc kubenswrapper[4824]: I0224 00:26:47.939856 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-builder-dockercfg-gqxxs-pull" (OuterVolumeSpecName: "builder-dockercfg-gqxxs-pull") pod "77c0907d-81a1-4a0c-81cf-cac502f6c8dc" (UID: "77c0907d-81a1-4a0c-81cf-cac502f6c8dc"). InnerVolumeSpecName "builder-dockercfg-gqxxs-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:26:47 crc kubenswrapper[4824]: I0224 00:26:47.940727 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-builder-dockercfg-gqxxs-push" (OuterVolumeSpecName: "builder-dockercfg-gqxxs-push") pod "77c0907d-81a1-4a0c-81cf-cac502f6c8dc" (UID: "77c0907d-81a1-4a0c-81cf-cac502f6c8dc"). InnerVolumeSpecName "builder-dockercfg-gqxxs-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:26:47 crc kubenswrapper[4824]: I0224 00:26:47.940822 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-kube-api-access-4mcgr" (OuterVolumeSpecName: "kube-api-access-4mcgr") pod "77c0907d-81a1-4a0c-81cf-cac502f6c8dc" (UID: "77c0907d-81a1-4a0c-81cf-cac502f6c8dc"). InnerVolumeSpecName "kube-api-access-4mcgr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:26:47 crc kubenswrapper[4824]: I0224 00:26:47.948684 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "77c0907d-81a1-4a0c-81cf-cac502f6c8dc" (UID: "77c0907d-81a1-4a0c-81cf-cac502f6c8dc"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:26:48 crc kubenswrapper[4824]: I0224 00:26:48.031737 4824 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 24 00:26:48 crc kubenswrapper[4824]: I0224 00:26:48.031786 4824 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-builder-dockercfg-gqxxs-push\") on node \"crc\" DevicePath \"\"" Feb 24 00:26:48 crc kubenswrapper[4824]: I0224 00:26:48.031796 4824 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-container-storage-run\") on node \"crc\" DevicePath \"\"" Feb 24 00:26:48 crc kubenswrapper[4824]: I0224 00:26:48.031808 4824 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-buildcachedir\") on node \"crc\" DevicePath \"\"" Feb 24 00:26:48 crc kubenswrapper[4824]: I0224 00:26:48.031818 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mcgr\" (UniqueName: \"kubernetes.io/projected/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-kube-api-access-4mcgr\") on node \"crc\" DevicePath \"\"" Feb 24 00:26:48 crc kubenswrapper[4824]: I0224 00:26:48.031829 4824 reconciler_common.go:293] "Volume detached 
for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Feb 24 00:26:48 crc kubenswrapper[4824]: I0224 00:26:48.031838 4824 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-buildworkdir\") on node \"crc\" DevicePath \"\"" Feb 24 00:26:48 crc kubenswrapper[4824]: I0224 00:26:48.031847 4824 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 24 00:26:48 crc kubenswrapper[4824]: I0224 00:26:48.031856 4824 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-build-system-configs\") on node \"crc\" DevicePath \"\"" Feb 24 00:26:48 crc kubenswrapper[4824]: I0224 00:26:48.031866 4824 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-builder-dockercfg-gqxxs-pull\") on node \"crc\" DevicePath \"\"" Feb 24 00:26:48 crc kubenswrapper[4824]: I0224 00:26:48.323925 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "77c0907d-81a1-4a0c-81cf-cac502f6c8dc" (UID: "77c0907d-81a1-4a0c-81cf-cac502f6c8dc"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:26:48 crc kubenswrapper[4824]: I0224 00:26:48.337139 4824 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-build-blob-cache\") on node \"crc\" DevicePath \"\"" Feb 24 00:26:48 crc kubenswrapper[4824]: I0224 00:26:48.639165 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"77c0907d-81a1-4a0c-81cf-cac502f6c8dc","Type":"ContainerDied","Data":"b8562531694c8525f4bf4aec2b90a7ef10fb034a969d79ee542b0dea17176f13"} Feb 24 00:26:48 crc kubenswrapper[4824]: I0224 00:26:48.639231 4824 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8562531694c8525f4bf4aec2b90a7ef10fb034a969d79ee542b0dea17176f13" Feb 24 00:26:48 crc kubenswrapper[4824]: I0224 00:26:48.639232 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-2-build" Feb 24 00:26:50 crc kubenswrapper[4824]: I0224 00:26:50.535683 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "77c0907d-81a1-4a0c-81cf-cac502f6c8dc" (UID: "77c0907d-81a1-4a0c-81cf-cac502f6c8dc"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:26:50 crc kubenswrapper[4824]: I0224 00:26:50.578173 4824 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/77c0907d-81a1-4a0c-81cf-cac502f6c8dc-container-storage-root\") on node \"crc\" DevicePath \"\"" Feb 24 00:26:52 crc kubenswrapper[4824]: I0224 00:26:52.665335 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-bridge-1-build"] Feb 24 00:26:52 crc kubenswrapper[4824]: E0224 00:26:52.666080 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77c0907d-81a1-4a0c-81cf-cac502f6c8dc" containerName="git-clone" Feb 24 00:26:52 crc kubenswrapper[4824]: I0224 00:26:52.666097 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="77c0907d-81a1-4a0c-81cf-cac502f6c8dc" containerName="git-clone" Feb 24 00:26:52 crc kubenswrapper[4824]: E0224 00:26:52.666118 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77c0907d-81a1-4a0c-81cf-cac502f6c8dc" containerName="docker-build" Feb 24 00:26:52 crc kubenswrapper[4824]: I0224 00:26:52.666124 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="77c0907d-81a1-4a0c-81cf-cac502f6c8dc" containerName="docker-build" Feb 24 00:26:52 crc kubenswrapper[4824]: E0224 00:26:52.666136 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77c0907d-81a1-4a0c-81cf-cac502f6c8dc" containerName="manage-dockerfile" Feb 24 00:26:52 crc kubenswrapper[4824]: I0224 00:26:52.666142 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="77c0907d-81a1-4a0c-81cf-cac502f6c8dc" containerName="manage-dockerfile" Feb 24 00:26:52 crc kubenswrapper[4824]: I0224 00:26:52.666268 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="77c0907d-81a1-4a0c-81cf-cac502f6c8dc" containerName="docker-build" Feb 24 00:26:52 crc kubenswrapper[4824]: I0224 00:26:52.667065 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Feb 24 00:26:52 crc kubenswrapper[4824]: I0224 00:26:52.669442 4824 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-gqxxs" Feb 24 00:26:52 crc kubenswrapper[4824]: I0224 00:26:52.669809 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-1-sys-config" Feb 24 00:26:52 crc kubenswrapper[4824]: I0224 00:26:52.670014 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-1-global-ca" Feb 24 00:26:52 crc kubenswrapper[4824]: I0224 00:26:52.670263 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-1-ca" Feb 24 00:26:52 crc kubenswrapper[4824]: I0224 00:26:52.704541 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Feb 24 00:26:52 crc kubenswrapper[4824]: I0224 00:26:52.734607 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/af6c88fc-9fa1-46aa-9060-3d202479481c-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"af6c88fc-9fa1-46aa-9060-3d202479481c\") " pod="service-telemetry/sg-bridge-1-build" Feb 24 00:26:52 crc kubenswrapper[4824]: I0224 00:26:52.734661 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/af6c88fc-9fa1-46aa-9060-3d202479481c-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"af6c88fc-9fa1-46aa-9060-3d202479481c\") " pod="service-telemetry/sg-bridge-1-build" Feb 24 00:26:52 crc kubenswrapper[4824]: I0224 00:26:52.734706 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: 
\"kubernetes.io/configmap/af6c88fc-9fa1-46aa-9060-3d202479481c-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"af6c88fc-9fa1-46aa-9060-3d202479481c\") " pod="service-telemetry/sg-bridge-1-build" Feb 24 00:26:52 crc kubenswrapper[4824]: I0224 00:26:52.734732 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/af6c88fc-9fa1-46aa-9060-3d202479481c-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"af6c88fc-9fa1-46aa-9060-3d202479481c\") " pod="service-telemetry/sg-bridge-1-build" Feb 24 00:26:52 crc kubenswrapper[4824]: I0224 00:26:52.734754 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/af6c88fc-9fa1-46aa-9060-3d202479481c-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"af6c88fc-9fa1-46aa-9060-3d202479481c\") " pod="service-telemetry/sg-bridge-1-build" Feb 24 00:26:52 crc kubenswrapper[4824]: I0224 00:26:52.734790 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/af6c88fc-9fa1-46aa-9060-3d202479481c-builder-dockercfg-gqxxs-pull\") pod \"sg-bridge-1-build\" (UID: \"af6c88fc-9fa1-46aa-9060-3d202479481c\") " pod="service-telemetry/sg-bridge-1-build" Feb 24 00:26:52 crc kubenswrapper[4824]: I0224 00:26:52.734816 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/af6c88fc-9fa1-46aa-9060-3d202479481c-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"af6c88fc-9fa1-46aa-9060-3d202479481c\") " pod="service-telemetry/sg-bridge-1-build" Feb 24 00:26:52 crc kubenswrapper[4824]: I0224 00:26:52.734849 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/af6c88fc-9fa1-46aa-9060-3d202479481c-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"af6c88fc-9fa1-46aa-9060-3d202479481c\") " pod="service-telemetry/sg-bridge-1-build" Feb 24 00:26:52 crc kubenswrapper[4824]: I0224 00:26:52.734888 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/af6c88fc-9fa1-46aa-9060-3d202479481c-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"af6c88fc-9fa1-46aa-9060-3d202479481c\") " pod="service-telemetry/sg-bridge-1-build" Feb 24 00:26:52 crc kubenswrapper[4824]: I0224 00:26:52.734923 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/af6c88fc-9fa1-46aa-9060-3d202479481c-builder-dockercfg-gqxxs-push\") pod \"sg-bridge-1-build\" (UID: \"af6c88fc-9fa1-46aa-9060-3d202479481c\") " pod="service-telemetry/sg-bridge-1-build" Feb 24 00:26:52 crc kubenswrapper[4824]: I0224 00:26:52.734990 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/af6c88fc-9fa1-46aa-9060-3d202479481c-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"af6c88fc-9fa1-46aa-9060-3d202479481c\") " pod="service-telemetry/sg-bridge-1-build" Feb 24 00:26:52 crc kubenswrapper[4824]: I0224 00:26:52.735021 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49sqs\" (UniqueName: \"kubernetes.io/projected/af6c88fc-9fa1-46aa-9060-3d202479481c-kube-api-access-49sqs\") pod \"sg-bridge-1-build\" (UID: \"af6c88fc-9fa1-46aa-9060-3d202479481c\") " pod="service-telemetry/sg-bridge-1-build" Feb 24 00:26:52 crc kubenswrapper[4824]: I0224 00:26:52.837058 4824 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/af6c88fc-9fa1-46aa-9060-3d202479481c-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"af6c88fc-9fa1-46aa-9060-3d202479481c\") " pod="service-telemetry/sg-bridge-1-build" Feb 24 00:26:52 crc kubenswrapper[4824]: I0224 00:26:52.837200 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/af6c88fc-9fa1-46aa-9060-3d202479481c-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"af6c88fc-9fa1-46aa-9060-3d202479481c\") " pod="service-telemetry/sg-bridge-1-build" Feb 24 00:26:52 crc kubenswrapper[4824]: I0224 00:26:52.837255 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/af6c88fc-9fa1-46aa-9060-3d202479481c-builder-dockercfg-gqxxs-push\") pod \"sg-bridge-1-build\" (UID: \"af6c88fc-9fa1-46aa-9060-3d202479481c\") " pod="service-telemetry/sg-bridge-1-build" Feb 24 00:26:52 crc kubenswrapper[4824]: I0224 00:26:52.837313 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/af6c88fc-9fa1-46aa-9060-3d202479481c-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"af6c88fc-9fa1-46aa-9060-3d202479481c\") " pod="service-telemetry/sg-bridge-1-build" Feb 24 00:26:52 crc kubenswrapper[4824]: I0224 00:26:52.837337 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49sqs\" (UniqueName: \"kubernetes.io/projected/af6c88fc-9fa1-46aa-9060-3d202479481c-kube-api-access-49sqs\") pod \"sg-bridge-1-build\" (UID: \"af6c88fc-9fa1-46aa-9060-3d202479481c\") " pod="service-telemetry/sg-bridge-1-build" Feb 24 00:26:52 crc kubenswrapper[4824]: I0224 00:26:52.837861 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/af6c88fc-9fa1-46aa-9060-3d202479481c-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"af6c88fc-9fa1-46aa-9060-3d202479481c\") " pod="service-telemetry/sg-bridge-1-build" Feb 24 00:26:52 crc kubenswrapper[4824]: I0224 00:26:52.837919 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/af6c88fc-9fa1-46aa-9060-3d202479481c-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"af6c88fc-9fa1-46aa-9060-3d202479481c\") " pod="service-telemetry/sg-bridge-1-build" Feb 24 00:26:52 crc kubenswrapper[4824]: I0224 00:26:52.837962 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/af6c88fc-9fa1-46aa-9060-3d202479481c-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"af6c88fc-9fa1-46aa-9060-3d202479481c\") " pod="service-telemetry/sg-bridge-1-build" Feb 24 00:26:52 crc kubenswrapper[4824]: I0224 00:26:52.838013 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/af6c88fc-9fa1-46aa-9060-3d202479481c-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"af6c88fc-9fa1-46aa-9060-3d202479481c\") " pod="service-telemetry/sg-bridge-1-build" Feb 24 00:26:52 crc kubenswrapper[4824]: I0224 00:26:52.838052 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/af6c88fc-9fa1-46aa-9060-3d202479481c-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"af6c88fc-9fa1-46aa-9060-3d202479481c\") " pod="service-telemetry/sg-bridge-1-build" Feb 24 00:26:52 crc kubenswrapper[4824]: I0224 00:26:52.838096 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/af6c88fc-9fa1-46aa-9060-3d202479481c-node-pullsecrets\") 
pod \"sg-bridge-1-build\" (UID: \"af6c88fc-9fa1-46aa-9060-3d202479481c\") " pod="service-telemetry/sg-bridge-1-build" Feb 24 00:26:52 crc kubenswrapper[4824]: I0224 00:26:52.838121 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/af6c88fc-9fa1-46aa-9060-3d202479481c-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"af6c88fc-9fa1-46aa-9060-3d202479481c\") " pod="service-telemetry/sg-bridge-1-build" Feb 24 00:26:52 crc kubenswrapper[4824]: I0224 00:26:52.838315 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/af6c88fc-9fa1-46aa-9060-3d202479481c-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"af6c88fc-9fa1-46aa-9060-3d202479481c\") " pod="service-telemetry/sg-bridge-1-build" Feb 24 00:26:52 crc kubenswrapper[4824]: I0224 00:26:52.838570 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/af6c88fc-9fa1-46aa-9060-3d202479481c-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"af6c88fc-9fa1-46aa-9060-3d202479481c\") " pod="service-telemetry/sg-bridge-1-build" Feb 24 00:26:52 crc kubenswrapper[4824]: I0224 00:26:52.838616 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/af6c88fc-9fa1-46aa-9060-3d202479481c-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"af6c88fc-9fa1-46aa-9060-3d202479481c\") " pod="service-telemetry/sg-bridge-1-build" Feb 24 00:26:52 crc kubenswrapper[4824]: I0224 00:26:52.838617 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/af6c88fc-9fa1-46aa-9060-3d202479481c-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"af6c88fc-9fa1-46aa-9060-3d202479481c\") " 
pod="service-telemetry/sg-bridge-1-build" Feb 24 00:26:52 crc kubenswrapper[4824]: I0224 00:26:52.838658 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/af6c88fc-9fa1-46aa-9060-3d202479481c-builder-dockercfg-gqxxs-pull\") pod \"sg-bridge-1-build\" (UID: \"af6c88fc-9fa1-46aa-9060-3d202479481c\") " pod="service-telemetry/sg-bridge-1-build" Feb 24 00:26:52 crc kubenswrapper[4824]: I0224 00:26:52.838679 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/af6c88fc-9fa1-46aa-9060-3d202479481c-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"af6c88fc-9fa1-46aa-9060-3d202479481c\") " pod="service-telemetry/sg-bridge-1-build" Feb 24 00:26:52 crc kubenswrapper[4824]: I0224 00:26:52.838788 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/af6c88fc-9fa1-46aa-9060-3d202479481c-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"af6c88fc-9fa1-46aa-9060-3d202479481c\") " pod="service-telemetry/sg-bridge-1-build" Feb 24 00:26:52 crc kubenswrapper[4824]: I0224 00:26:52.839022 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/af6c88fc-9fa1-46aa-9060-3d202479481c-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"af6c88fc-9fa1-46aa-9060-3d202479481c\") " pod="service-telemetry/sg-bridge-1-build" Feb 24 00:26:53 crc kubenswrapper[4824]: I0224 00:26:53.324976 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/af6c88fc-9fa1-46aa-9060-3d202479481c-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"af6c88fc-9fa1-46aa-9060-3d202479481c\") " pod="service-telemetry/sg-bridge-1-build" Feb 24 00:26:53 crc kubenswrapper[4824]: I0224 
00:26:53.325639 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/af6c88fc-9fa1-46aa-9060-3d202479481c-builder-dockercfg-gqxxs-pull\") pod \"sg-bridge-1-build\" (UID: \"af6c88fc-9fa1-46aa-9060-3d202479481c\") " pod="service-telemetry/sg-bridge-1-build" Feb 24 00:26:53 crc kubenswrapper[4824]: I0224 00:26:53.325684 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/af6c88fc-9fa1-46aa-9060-3d202479481c-builder-dockercfg-gqxxs-push\") pod \"sg-bridge-1-build\" (UID: \"af6c88fc-9fa1-46aa-9060-3d202479481c\") " pod="service-telemetry/sg-bridge-1-build" Feb 24 00:26:53 crc kubenswrapper[4824]: I0224 00:26:53.329057 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49sqs\" (UniqueName: \"kubernetes.io/projected/af6c88fc-9fa1-46aa-9060-3d202479481c-kube-api-access-49sqs\") pod \"sg-bridge-1-build\" (UID: \"af6c88fc-9fa1-46aa-9060-3d202479481c\") " pod="service-telemetry/sg-bridge-1-build" Feb 24 00:26:53 crc kubenswrapper[4824]: I0224 00:26:53.587884 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Feb 24 00:26:53 crc kubenswrapper[4824]: I0224 00:26:53.811000 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Feb 24 00:26:54 crc kubenswrapper[4824]: I0224 00:26:54.682925 4824 generic.go:334] "Generic (PLEG): container finished" podID="af6c88fc-9fa1-46aa-9060-3d202479481c" containerID="6597cb9e10c28b83d05743f49d3783a414315524b71024badd2e95d60b7946b9" exitCode=0 Feb 24 00:26:54 crc kubenswrapper[4824]: I0224 00:26:54.683012 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"af6c88fc-9fa1-46aa-9060-3d202479481c","Type":"ContainerDied","Data":"6597cb9e10c28b83d05743f49d3783a414315524b71024badd2e95d60b7946b9"} Feb 24 00:26:54 crc kubenswrapper[4824]: I0224 00:26:54.683543 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"af6c88fc-9fa1-46aa-9060-3d202479481c","Type":"ContainerStarted","Data":"1dd6865d475a5af0f51171b6fff366385505696bcae6478e182f5800edc3ac14"} Feb 24 00:26:55 crc kubenswrapper[4824]: I0224 00:26:55.695015 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"af6c88fc-9fa1-46aa-9060-3d202479481c","Type":"ContainerStarted","Data":"36ea10b047001c97d42518d7957d7822f59fe55dfe1fae6a92c5691f9b7ed45f"} Feb 24 00:26:55 crc kubenswrapper[4824]: I0224 00:26:55.726028 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-bridge-1-build" podStartSLOduration=3.7260061110000002 podStartE2EDuration="3.726006111s" podCreationTimestamp="2026-02-24 00:26:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:26:55.720337678 +0000 UTC m=+1279.709962167" watchObservedRunningTime="2026-02-24 00:26:55.726006111 +0000 UTC m=+1279.715630600" Feb 24 
00:27:03 crc kubenswrapper[4824]: I0224 00:27:03.334084 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Feb 24 00:27:03 crc kubenswrapper[4824]: I0224 00:27:03.335031 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/sg-bridge-1-build" podUID="af6c88fc-9fa1-46aa-9060-3d202479481c" containerName="docker-build" containerID="cri-o://36ea10b047001c97d42518d7957d7822f59fe55dfe1fae6a92c5691f9b7ed45f" gracePeriod=30 Feb 24 00:27:03 crc kubenswrapper[4824]: I0224 00:27:03.697447 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-1-build_af6c88fc-9fa1-46aa-9060-3d202479481c/docker-build/0.log" Feb 24 00:27:03 crc kubenswrapper[4824]: I0224 00:27:03.698234 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Feb 24 00:27:03 crc kubenswrapper[4824]: I0224 00:27:03.712091 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/af6c88fc-9fa1-46aa-9060-3d202479481c-builder-dockercfg-gqxxs-pull\") pod \"af6c88fc-9fa1-46aa-9060-3d202479481c\" (UID: \"af6c88fc-9fa1-46aa-9060-3d202479481c\") " Feb 24 00:27:03 crc kubenswrapper[4824]: I0224 00:27:03.712623 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/af6c88fc-9fa1-46aa-9060-3d202479481c-build-system-configs\") pod \"af6c88fc-9fa1-46aa-9060-3d202479481c\" (UID: \"af6c88fc-9fa1-46aa-9060-3d202479481c\") " Feb 24 00:27:03 crc kubenswrapper[4824]: I0224 00:27:03.712706 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/af6c88fc-9fa1-46aa-9060-3d202479481c-container-storage-run\") pod \"af6c88fc-9fa1-46aa-9060-3d202479481c\" (UID: 
\"af6c88fc-9fa1-46aa-9060-3d202479481c\") " Feb 24 00:27:03 crc kubenswrapper[4824]: I0224 00:27:03.712753 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/af6c88fc-9fa1-46aa-9060-3d202479481c-buildworkdir\") pod \"af6c88fc-9fa1-46aa-9060-3d202479481c\" (UID: \"af6c88fc-9fa1-46aa-9060-3d202479481c\") " Feb 24 00:27:03 crc kubenswrapper[4824]: I0224 00:27:03.712786 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/af6c88fc-9fa1-46aa-9060-3d202479481c-node-pullsecrets\") pod \"af6c88fc-9fa1-46aa-9060-3d202479481c\" (UID: \"af6c88fc-9fa1-46aa-9060-3d202479481c\") " Feb 24 00:27:03 crc kubenswrapper[4824]: I0224 00:27:03.712842 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49sqs\" (UniqueName: \"kubernetes.io/projected/af6c88fc-9fa1-46aa-9060-3d202479481c-kube-api-access-49sqs\") pod \"af6c88fc-9fa1-46aa-9060-3d202479481c\" (UID: \"af6c88fc-9fa1-46aa-9060-3d202479481c\") " Feb 24 00:27:03 crc kubenswrapper[4824]: I0224 00:27:03.712890 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/af6c88fc-9fa1-46aa-9060-3d202479481c-buildcachedir\") pod \"af6c88fc-9fa1-46aa-9060-3d202479481c\" (UID: \"af6c88fc-9fa1-46aa-9060-3d202479481c\") " Feb 24 00:27:03 crc kubenswrapper[4824]: I0224 00:27:03.712932 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/af6c88fc-9fa1-46aa-9060-3d202479481c-container-storage-root\") pod \"af6c88fc-9fa1-46aa-9060-3d202479481c\" (UID: \"af6c88fc-9fa1-46aa-9060-3d202479481c\") " Feb 24 00:27:03 crc kubenswrapper[4824]: I0224 00:27:03.712965 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/af6c88fc-9fa1-46aa-9060-3d202479481c-builder-dockercfg-gqxxs-push\") pod \"af6c88fc-9fa1-46aa-9060-3d202479481c\" (UID: \"af6c88fc-9fa1-46aa-9060-3d202479481c\") " Feb 24 00:27:03 crc kubenswrapper[4824]: I0224 00:27:03.712993 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/af6c88fc-9fa1-46aa-9060-3d202479481c-build-proxy-ca-bundles\") pod \"af6c88fc-9fa1-46aa-9060-3d202479481c\" (UID: \"af6c88fc-9fa1-46aa-9060-3d202479481c\") " Feb 24 00:27:03 crc kubenswrapper[4824]: I0224 00:27:03.713054 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/af6c88fc-9fa1-46aa-9060-3d202479481c-build-ca-bundles\") pod \"af6c88fc-9fa1-46aa-9060-3d202479481c\" (UID: \"af6c88fc-9fa1-46aa-9060-3d202479481c\") " Feb 24 00:27:03 crc kubenswrapper[4824]: I0224 00:27:03.713085 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/af6c88fc-9fa1-46aa-9060-3d202479481c-build-blob-cache\") pod \"af6c88fc-9fa1-46aa-9060-3d202479481c\" (UID: \"af6c88fc-9fa1-46aa-9060-3d202479481c\") " Feb 24 00:27:03 crc kubenswrapper[4824]: I0224 00:27:03.714010 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/af6c88fc-9fa1-46aa-9060-3d202479481c-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "af6c88fc-9fa1-46aa-9060-3d202479481c" (UID: "af6c88fc-9fa1-46aa-9060-3d202479481c"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 00:27:03 crc kubenswrapper[4824]: I0224 00:27:03.714526 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af6c88fc-9fa1-46aa-9060-3d202479481c-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "af6c88fc-9fa1-46aa-9060-3d202479481c" (UID: "af6c88fc-9fa1-46aa-9060-3d202479481c"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:27:03 crc kubenswrapper[4824]: I0224 00:27:03.715014 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af6c88fc-9fa1-46aa-9060-3d202479481c-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "af6c88fc-9fa1-46aa-9060-3d202479481c" (UID: "af6c88fc-9fa1-46aa-9060-3d202479481c"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:27:03 crc kubenswrapper[4824]: I0224 00:27:03.715029 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af6c88fc-9fa1-46aa-9060-3d202479481c-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "af6c88fc-9fa1-46aa-9060-3d202479481c" (UID: "af6c88fc-9fa1-46aa-9060-3d202479481c"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:27:03 crc kubenswrapper[4824]: I0224 00:27:03.715475 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af6c88fc-9fa1-46aa-9060-3d202479481c-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "af6c88fc-9fa1-46aa-9060-3d202479481c" (UID: "af6c88fc-9fa1-46aa-9060-3d202479481c"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:27:03 crc kubenswrapper[4824]: I0224 00:27:03.715576 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/af6c88fc-9fa1-46aa-9060-3d202479481c-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "af6c88fc-9fa1-46aa-9060-3d202479481c" (UID: "af6c88fc-9fa1-46aa-9060-3d202479481c"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 00:27:03 crc kubenswrapper[4824]: I0224 00:27:03.715846 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af6c88fc-9fa1-46aa-9060-3d202479481c-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "af6c88fc-9fa1-46aa-9060-3d202479481c" (UID: "af6c88fc-9fa1-46aa-9060-3d202479481c"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:27:03 crc kubenswrapper[4824]: I0224 00:27:03.718503 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af6c88fc-9fa1-46aa-9060-3d202479481c-builder-dockercfg-gqxxs-push" (OuterVolumeSpecName: "builder-dockercfg-gqxxs-push") pod "af6c88fc-9fa1-46aa-9060-3d202479481c" (UID: "af6c88fc-9fa1-46aa-9060-3d202479481c"). InnerVolumeSpecName "builder-dockercfg-gqxxs-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:27:03 crc kubenswrapper[4824]: I0224 00:27:03.719145 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af6c88fc-9fa1-46aa-9060-3d202479481c-kube-api-access-49sqs" (OuterVolumeSpecName: "kube-api-access-49sqs") pod "af6c88fc-9fa1-46aa-9060-3d202479481c" (UID: "af6c88fc-9fa1-46aa-9060-3d202479481c"). InnerVolumeSpecName "kube-api-access-49sqs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:27:03 crc kubenswrapper[4824]: I0224 00:27:03.719653 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af6c88fc-9fa1-46aa-9060-3d202479481c-builder-dockercfg-gqxxs-pull" (OuterVolumeSpecName: "builder-dockercfg-gqxxs-pull") pod "af6c88fc-9fa1-46aa-9060-3d202479481c" (UID: "af6c88fc-9fa1-46aa-9060-3d202479481c"). InnerVolumeSpecName "builder-dockercfg-gqxxs-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:27:03 crc kubenswrapper[4824]: I0224 00:27:03.766163 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-1-build_af6c88fc-9fa1-46aa-9060-3d202479481c/docker-build/0.log" Feb 24 00:27:03 crc kubenswrapper[4824]: I0224 00:27:03.766631 4824 generic.go:334] "Generic (PLEG): container finished" podID="af6c88fc-9fa1-46aa-9060-3d202479481c" containerID="36ea10b047001c97d42518d7957d7822f59fe55dfe1fae6a92c5691f9b7ed45f" exitCode=1 Feb 24 00:27:03 crc kubenswrapper[4824]: I0224 00:27:03.766701 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"af6c88fc-9fa1-46aa-9060-3d202479481c","Type":"ContainerDied","Data":"36ea10b047001c97d42518d7957d7822f59fe55dfe1fae6a92c5691f9b7ed45f"} Feb 24 00:27:03 crc kubenswrapper[4824]: I0224 00:27:03.766780 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"af6c88fc-9fa1-46aa-9060-3d202479481c","Type":"ContainerDied","Data":"1dd6865d475a5af0f51171b6fff366385505696bcae6478e182f5800edc3ac14"} Feb 24 00:27:03 crc kubenswrapper[4824]: I0224 00:27:03.766771 4824 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Feb 24 00:27:03 crc kubenswrapper[4824]: I0224 00:27:03.766800 4824 scope.go:117] "RemoveContainer" containerID="36ea10b047001c97d42518d7957d7822f59fe55dfe1fae6a92c5691f9b7ed45f" Feb 24 00:27:03 crc kubenswrapper[4824]: I0224 00:27:03.789919 4824 scope.go:117] "RemoveContainer" containerID="6597cb9e10c28b83d05743f49d3783a414315524b71024badd2e95d60b7946b9" Feb 24 00:27:03 crc kubenswrapper[4824]: I0224 00:27:03.810018 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af6c88fc-9fa1-46aa-9060-3d202479481c-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "af6c88fc-9fa1-46aa-9060-3d202479481c" (UID: "af6c88fc-9fa1-46aa-9060-3d202479481c"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:27:03 crc kubenswrapper[4824]: I0224 00:27:03.815030 4824 scope.go:117] "RemoveContainer" containerID="36ea10b047001c97d42518d7957d7822f59fe55dfe1fae6a92c5691f9b7ed45f" Feb 24 00:27:03 crc kubenswrapper[4824]: I0224 00:27:03.815308 4824 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/af6c88fc-9fa1-46aa-9060-3d202479481c-container-storage-run\") on node \"crc\" DevicePath \"\"" Feb 24 00:27:03 crc kubenswrapper[4824]: I0224 00:27:03.815340 4824 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/af6c88fc-9fa1-46aa-9060-3d202479481c-buildworkdir\") on node \"crc\" DevicePath \"\"" Feb 24 00:27:03 crc kubenswrapper[4824]: I0224 00:27:03.815352 4824 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/af6c88fc-9fa1-46aa-9060-3d202479481c-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Feb 24 00:27:03 crc kubenswrapper[4824]: I0224 00:27:03.815361 4824 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-49sqs\" (UniqueName: \"kubernetes.io/projected/af6c88fc-9fa1-46aa-9060-3d202479481c-kube-api-access-49sqs\") on node \"crc\" DevicePath \"\"" Feb 24 00:27:03 crc kubenswrapper[4824]: I0224 00:27:03.815370 4824 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/af6c88fc-9fa1-46aa-9060-3d202479481c-buildcachedir\") on node \"crc\" DevicePath \"\"" Feb 24 00:27:03 crc kubenswrapper[4824]: I0224 00:27:03.815379 4824 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/af6c88fc-9fa1-46aa-9060-3d202479481c-builder-dockercfg-gqxxs-push\") on node \"crc\" DevicePath \"\"" Feb 24 00:27:03 crc kubenswrapper[4824]: I0224 00:27:03.815388 4824 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/af6c88fc-9fa1-46aa-9060-3d202479481c-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 24 00:27:03 crc kubenswrapper[4824]: I0224 00:27:03.815399 4824 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/af6c88fc-9fa1-46aa-9060-3d202479481c-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 24 00:27:03 crc kubenswrapper[4824]: I0224 00:27:03.815407 4824 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/af6c88fc-9fa1-46aa-9060-3d202479481c-build-blob-cache\") on node \"crc\" DevicePath \"\"" Feb 24 00:27:03 crc kubenswrapper[4824]: I0224 00:27:03.815415 4824 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/af6c88fc-9fa1-46aa-9060-3d202479481c-builder-dockercfg-gqxxs-pull\") on node \"crc\" DevicePath \"\"" Feb 24 00:27:03 crc kubenswrapper[4824]: I0224 00:27:03.815423 4824 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: 
\"kubernetes.io/configmap/af6c88fc-9fa1-46aa-9060-3d202479481c-build-system-configs\") on node \"crc\" DevicePath \"\"" Feb 24 00:27:03 crc kubenswrapper[4824]: E0224 00:27:03.815543 4824 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36ea10b047001c97d42518d7957d7822f59fe55dfe1fae6a92c5691f9b7ed45f\": container with ID starting with 36ea10b047001c97d42518d7957d7822f59fe55dfe1fae6a92c5691f9b7ed45f not found: ID does not exist" containerID="36ea10b047001c97d42518d7957d7822f59fe55dfe1fae6a92c5691f9b7ed45f" Feb 24 00:27:03 crc kubenswrapper[4824]: I0224 00:27:03.815591 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36ea10b047001c97d42518d7957d7822f59fe55dfe1fae6a92c5691f9b7ed45f"} err="failed to get container status \"36ea10b047001c97d42518d7957d7822f59fe55dfe1fae6a92c5691f9b7ed45f\": rpc error: code = NotFound desc = could not find container \"36ea10b047001c97d42518d7957d7822f59fe55dfe1fae6a92c5691f9b7ed45f\": container with ID starting with 36ea10b047001c97d42518d7957d7822f59fe55dfe1fae6a92c5691f9b7ed45f not found: ID does not exist" Feb 24 00:27:03 crc kubenswrapper[4824]: I0224 00:27:03.815628 4824 scope.go:117] "RemoveContainer" containerID="6597cb9e10c28b83d05743f49d3783a414315524b71024badd2e95d60b7946b9" Feb 24 00:27:03 crc kubenswrapper[4824]: E0224 00:27:03.815980 4824 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6597cb9e10c28b83d05743f49d3783a414315524b71024badd2e95d60b7946b9\": container with ID starting with 6597cb9e10c28b83d05743f49d3783a414315524b71024badd2e95d60b7946b9 not found: ID does not exist" containerID="6597cb9e10c28b83d05743f49d3783a414315524b71024badd2e95d60b7946b9" Feb 24 00:27:03 crc kubenswrapper[4824]: I0224 00:27:03.816017 4824 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6597cb9e10c28b83d05743f49d3783a414315524b71024badd2e95d60b7946b9"} err="failed to get container status \"6597cb9e10c28b83d05743f49d3783a414315524b71024badd2e95d60b7946b9\": rpc error: code = NotFound desc = could not find container \"6597cb9e10c28b83d05743f49d3783a414315524b71024badd2e95d60b7946b9\": container with ID starting with 6597cb9e10c28b83d05743f49d3783a414315524b71024badd2e95d60b7946b9 not found: ID does not exist" Feb 24 00:27:04 crc kubenswrapper[4824]: I0224 00:27:04.096153 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af6c88fc-9fa1-46aa-9060-3d202479481c-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "af6c88fc-9fa1-46aa-9060-3d202479481c" (UID: "af6c88fc-9fa1-46aa-9060-3d202479481c"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:27:04 crc kubenswrapper[4824]: I0224 00:27:04.120428 4824 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/af6c88fc-9fa1-46aa-9060-3d202479481c-container-storage-root\") on node \"crc\" DevicePath \"\"" Feb 24 00:27:04 crc kubenswrapper[4824]: I0224 00:27:04.412169 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Feb 24 00:27:04 crc kubenswrapper[4824]: I0224 00:27:04.419039 4824 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Feb 24 00:27:04 crc kubenswrapper[4824]: I0224 00:27:04.703148 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af6c88fc-9fa1-46aa-9060-3d202479481c" path="/var/lib/kubelet/pods/af6c88fc-9fa1-46aa-9060-3d202479481c/volumes" Feb 24 00:27:05 crc kubenswrapper[4824]: I0224 00:27:05.168560 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-bridge-2-build"] Feb 24 00:27:05 crc kubenswrapper[4824]: E0224 
00:27:05.169372 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af6c88fc-9fa1-46aa-9060-3d202479481c" containerName="manage-dockerfile" Feb 24 00:27:05 crc kubenswrapper[4824]: I0224 00:27:05.169457 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="af6c88fc-9fa1-46aa-9060-3d202479481c" containerName="manage-dockerfile" Feb 24 00:27:05 crc kubenswrapper[4824]: E0224 00:27:05.169573 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af6c88fc-9fa1-46aa-9060-3d202479481c" containerName="docker-build" Feb 24 00:27:05 crc kubenswrapper[4824]: I0224 00:27:05.169652 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="af6c88fc-9fa1-46aa-9060-3d202479481c" containerName="docker-build" Feb 24 00:27:05 crc kubenswrapper[4824]: I0224 00:27:05.169865 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="af6c88fc-9fa1-46aa-9060-3d202479481c" containerName="docker-build" Feb 24 00:27:05 crc kubenswrapper[4824]: I0224 00:27:05.171027 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Feb 24 00:27:05 crc kubenswrapper[4824]: I0224 00:27:05.173762 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-2-sys-config" Feb 24 00:27:05 crc kubenswrapper[4824]: I0224 00:27:05.173834 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-2-ca" Feb 24 00:27:05 crc kubenswrapper[4824]: I0224 00:27:05.174178 4824 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-gqxxs" Feb 24 00:27:05 crc kubenswrapper[4824]: I0224 00:27:05.175605 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-2-global-ca" Feb 24 00:27:05 crc kubenswrapper[4824]: I0224 00:27:05.192183 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-2-build"] Feb 24 00:27:05 crc kubenswrapper[4824]: I0224 00:27:05.235099 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e64cc79f-399d-4a53-b509-a6618d565cbf-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"e64cc79f-399d-4a53-b509-a6618d565cbf\") " pod="service-telemetry/sg-bridge-2-build" Feb 24 00:27:05 crc kubenswrapper[4824]: I0224 00:27:05.235158 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e64cc79f-399d-4a53-b509-a6618d565cbf-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"e64cc79f-399d-4a53-b509-a6618d565cbf\") " pod="service-telemetry/sg-bridge-2-build" Feb 24 00:27:05 crc kubenswrapper[4824]: I0224 00:27:05.235183 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/e64cc79f-399d-4a53-b509-a6618d565cbf-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"e64cc79f-399d-4a53-b509-a6618d565cbf\") " pod="service-telemetry/sg-bridge-2-build" Feb 24 00:27:05 crc kubenswrapper[4824]: I0224 00:27:05.235205 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e64cc79f-399d-4a53-b509-a6618d565cbf-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"e64cc79f-399d-4a53-b509-a6618d565cbf\") " pod="service-telemetry/sg-bridge-2-build" Feb 24 00:27:05 crc kubenswrapper[4824]: I0224 00:27:05.235226 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e64cc79f-399d-4a53-b509-a6618d565cbf-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"e64cc79f-399d-4a53-b509-a6618d565cbf\") " pod="service-telemetry/sg-bridge-2-build" Feb 24 00:27:05 crc kubenswrapper[4824]: I0224 00:27:05.235483 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnvml\" (UniqueName: \"kubernetes.io/projected/e64cc79f-399d-4a53-b509-a6618d565cbf-kube-api-access-bnvml\") pod \"sg-bridge-2-build\" (UID: \"e64cc79f-399d-4a53-b509-a6618d565cbf\") " pod="service-telemetry/sg-bridge-2-build" Feb 24 00:27:05 crc kubenswrapper[4824]: I0224 00:27:05.235611 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e64cc79f-399d-4a53-b509-a6618d565cbf-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"e64cc79f-399d-4a53-b509-a6618d565cbf\") " pod="service-telemetry/sg-bridge-2-build" Feb 24 00:27:05 crc kubenswrapper[4824]: I0224 00:27:05.235733 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/e64cc79f-399d-4a53-b509-a6618d565cbf-builder-dockercfg-gqxxs-pull\") pod \"sg-bridge-2-build\" (UID: \"e64cc79f-399d-4a53-b509-a6618d565cbf\") " pod="service-telemetry/sg-bridge-2-build" Feb 24 00:27:05 crc kubenswrapper[4824]: I0224 00:27:05.235823 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e64cc79f-399d-4a53-b509-a6618d565cbf-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"e64cc79f-399d-4a53-b509-a6618d565cbf\") " pod="service-telemetry/sg-bridge-2-build" Feb 24 00:27:05 crc kubenswrapper[4824]: I0224 00:27:05.235868 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/e64cc79f-399d-4a53-b509-a6618d565cbf-builder-dockercfg-gqxxs-push\") pod \"sg-bridge-2-build\" (UID: \"e64cc79f-399d-4a53-b509-a6618d565cbf\") " pod="service-telemetry/sg-bridge-2-build" Feb 24 00:27:05 crc kubenswrapper[4824]: I0224 00:27:05.235962 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e64cc79f-399d-4a53-b509-a6618d565cbf-buildworkdir\") pod \"sg-bridge-2-build\" (UID: \"e64cc79f-399d-4a53-b509-a6618d565cbf\") " pod="service-telemetry/sg-bridge-2-build" Feb 24 00:27:05 crc kubenswrapper[4824]: I0224 00:27:05.236092 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e64cc79f-399d-4a53-b509-a6618d565cbf-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"e64cc79f-399d-4a53-b509-a6618d565cbf\") " pod="service-telemetry/sg-bridge-2-build" Feb 24 00:27:05 crc kubenswrapper[4824]: I0224 00:27:05.337488 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e64cc79f-399d-4a53-b509-a6618d565cbf-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"e64cc79f-399d-4a53-b509-a6618d565cbf\") " pod="service-telemetry/sg-bridge-2-build" Feb 24 00:27:05 crc kubenswrapper[4824]: I0224 00:27:05.337874 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e64cc79f-399d-4a53-b509-a6618d565cbf-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"e64cc79f-399d-4a53-b509-a6618d565cbf\") " pod="service-telemetry/sg-bridge-2-build" Feb 24 00:27:05 crc kubenswrapper[4824]: I0224 00:27:05.337961 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e64cc79f-399d-4a53-b509-a6618d565cbf-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"e64cc79f-399d-4a53-b509-a6618d565cbf\") " pod="service-telemetry/sg-bridge-2-build" Feb 24 00:27:05 crc kubenswrapper[4824]: I0224 00:27:05.338089 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e64cc79f-399d-4a53-b509-a6618d565cbf-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"e64cc79f-399d-4a53-b509-a6618d565cbf\") " pod="service-telemetry/sg-bridge-2-build" Feb 24 00:27:05 crc kubenswrapper[4824]: I0224 00:27:05.338184 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e64cc79f-399d-4a53-b509-a6618d565cbf-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"e64cc79f-399d-4a53-b509-a6618d565cbf\") " pod="service-telemetry/sg-bridge-2-build" Feb 24 00:27:05 crc kubenswrapper[4824]: I0224 00:27:05.338279 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: 
\"kubernetes.io/empty-dir/e64cc79f-399d-4a53-b509-a6618d565cbf-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"e64cc79f-399d-4a53-b509-a6618d565cbf\") " pod="service-telemetry/sg-bridge-2-build" Feb 24 00:27:05 crc kubenswrapper[4824]: I0224 00:27:05.338222 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e64cc79f-399d-4a53-b509-a6618d565cbf-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"e64cc79f-399d-4a53-b509-a6618d565cbf\") " pod="service-telemetry/sg-bridge-2-build" Feb 24 00:27:05 crc kubenswrapper[4824]: I0224 00:27:05.338424 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnvml\" (UniqueName: \"kubernetes.io/projected/e64cc79f-399d-4a53-b509-a6618d565cbf-kube-api-access-bnvml\") pod \"sg-bridge-2-build\" (UID: \"e64cc79f-399d-4a53-b509-a6618d565cbf\") " pod="service-telemetry/sg-bridge-2-build" Feb 24 00:27:05 crc kubenswrapper[4824]: I0224 00:27:05.338472 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e64cc79f-399d-4a53-b509-a6618d565cbf-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"e64cc79f-399d-4a53-b509-a6618d565cbf\") " pod="service-telemetry/sg-bridge-2-build" Feb 24 00:27:05 crc kubenswrapper[4824]: I0224 00:27:05.338676 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/e64cc79f-399d-4a53-b509-a6618d565cbf-builder-dockercfg-gqxxs-pull\") pod \"sg-bridge-2-build\" (UID: \"e64cc79f-399d-4a53-b509-a6618d565cbf\") " pod="service-telemetry/sg-bridge-2-build" Feb 24 00:27:05 crc kubenswrapper[4824]: I0224 00:27:05.338744 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: 
\"kubernetes.io/host-path/e64cc79f-399d-4a53-b509-a6618d565cbf-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"e64cc79f-399d-4a53-b509-a6618d565cbf\") " pod="service-telemetry/sg-bridge-2-build" Feb 24 00:27:05 crc kubenswrapper[4824]: I0224 00:27:05.338795 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/e64cc79f-399d-4a53-b509-a6618d565cbf-builder-dockercfg-gqxxs-push\") pod \"sg-bridge-2-build\" (UID: \"e64cc79f-399d-4a53-b509-a6618d565cbf\") " pod="service-telemetry/sg-bridge-2-build" Feb 24 00:27:05 crc kubenswrapper[4824]: I0224 00:27:05.338868 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e64cc79f-399d-4a53-b509-a6618d565cbf-buildworkdir\") pod \"sg-bridge-2-build\" (UID: \"e64cc79f-399d-4a53-b509-a6618d565cbf\") " pod="service-telemetry/sg-bridge-2-build" Feb 24 00:27:05 crc kubenswrapper[4824]: I0224 00:27:05.338887 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e64cc79f-399d-4a53-b509-a6618d565cbf-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"e64cc79f-399d-4a53-b509-a6618d565cbf\") " pod="service-telemetry/sg-bridge-2-build" Feb 24 00:27:05 crc kubenswrapper[4824]: I0224 00:27:05.338887 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e64cc79f-399d-4a53-b509-a6618d565cbf-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"e64cc79f-399d-4a53-b509-a6618d565cbf\") " pod="service-telemetry/sg-bridge-2-build" Feb 24 00:27:05 crc kubenswrapper[4824]: I0224 00:27:05.339029 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e64cc79f-399d-4a53-b509-a6618d565cbf-build-system-configs\") pod \"sg-bridge-2-build\" 
(UID: \"e64cc79f-399d-4a53-b509-a6618d565cbf\") " pod="service-telemetry/sg-bridge-2-build" Feb 24 00:27:05 crc kubenswrapper[4824]: I0224 00:27:05.339168 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e64cc79f-399d-4a53-b509-a6618d565cbf-buildworkdir\") pod \"sg-bridge-2-build\" (UID: \"e64cc79f-399d-4a53-b509-a6618d565cbf\") " pod="service-telemetry/sg-bridge-2-build" Feb 24 00:27:05 crc kubenswrapper[4824]: I0224 00:27:05.339205 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e64cc79f-399d-4a53-b509-a6618d565cbf-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"e64cc79f-399d-4a53-b509-a6618d565cbf\") " pod="service-telemetry/sg-bridge-2-build" Feb 24 00:27:05 crc kubenswrapper[4824]: I0224 00:27:05.339394 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e64cc79f-399d-4a53-b509-a6618d565cbf-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"e64cc79f-399d-4a53-b509-a6618d565cbf\") " pod="service-telemetry/sg-bridge-2-build" Feb 24 00:27:05 crc kubenswrapper[4824]: I0224 00:27:05.339848 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e64cc79f-399d-4a53-b509-a6618d565cbf-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"e64cc79f-399d-4a53-b509-a6618d565cbf\") " pod="service-telemetry/sg-bridge-2-build" Feb 24 00:27:05 crc kubenswrapper[4824]: I0224 00:27:05.340020 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e64cc79f-399d-4a53-b509-a6618d565cbf-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"e64cc79f-399d-4a53-b509-a6618d565cbf\") " pod="service-telemetry/sg-bridge-2-build" Feb 24 00:27:05 crc 
kubenswrapper[4824]: I0224 00:27:05.354743 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/e64cc79f-399d-4a53-b509-a6618d565cbf-builder-dockercfg-gqxxs-pull\") pod \"sg-bridge-2-build\" (UID: \"e64cc79f-399d-4a53-b509-a6618d565cbf\") " pod="service-telemetry/sg-bridge-2-build" Feb 24 00:27:05 crc kubenswrapper[4824]: I0224 00:27:05.356215 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/e64cc79f-399d-4a53-b509-a6618d565cbf-builder-dockercfg-gqxxs-push\") pod \"sg-bridge-2-build\" (UID: \"e64cc79f-399d-4a53-b509-a6618d565cbf\") " pod="service-telemetry/sg-bridge-2-build" Feb 24 00:27:05 crc kubenswrapper[4824]: I0224 00:27:05.358058 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnvml\" (UniqueName: \"kubernetes.io/projected/e64cc79f-399d-4a53-b509-a6618d565cbf-kube-api-access-bnvml\") pod \"sg-bridge-2-build\" (UID: \"e64cc79f-399d-4a53-b509-a6618d565cbf\") " pod="service-telemetry/sg-bridge-2-build" Feb 24 00:27:05 crc kubenswrapper[4824]: I0224 00:27:05.488644 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Feb 24 00:27:05 crc kubenswrapper[4824]: I0224 00:27:05.744856 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-2-build"] Feb 24 00:27:05 crc kubenswrapper[4824]: I0224 00:27:05.787545 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"e64cc79f-399d-4a53-b509-a6618d565cbf","Type":"ContainerStarted","Data":"954236d9ba17ac1f320cbfdd5017cf9874f771c683ae5d5acffcb803cd49b53e"} Feb 24 00:27:06 crc kubenswrapper[4824]: I0224 00:27:06.798466 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"e64cc79f-399d-4a53-b509-a6618d565cbf","Type":"ContainerStarted","Data":"da3bb2a807bfead3b4300c93db29f7a5d7e778cfc90eded40f001b306353f2f6"} Feb 24 00:27:07 crc kubenswrapper[4824]: I0224 00:27:07.807614 4824 generic.go:334] "Generic (PLEG): container finished" podID="e64cc79f-399d-4a53-b509-a6618d565cbf" containerID="da3bb2a807bfead3b4300c93db29f7a5d7e778cfc90eded40f001b306353f2f6" exitCode=0 Feb 24 00:27:07 crc kubenswrapper[4824]: I0224 00:27:07.807701 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"e64cc79f-399d-4a53-b509-a6618d565cbf","Type":"ContainerDied","Data":"da3bb2a807bfead3b4300c93db29f7a5d7e778cfc90eded40f001b306353f2f6"} Feb 24 00:27:08 crc kubenswrapper[4824]: I0224 00:27:08.819343 4824 generic.go:334] "Generic (PLEG): container finished" podID="e64cc79f-399d-4a53-b509-a6618d565cbf" containerID="d8656546214a503366dda5373f672466c18181aa2f185edec71172c976bece6d" exitCode=0 Feb 24 00:27:08 crc kubenswrapper[4824]: I0224 00:27:08.819446 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"e64cc79f-399d-4a53-b509-a6618d565cbf","Type":"ContainerDied","Data":"d8656546214a503366dda5373f672466c18181aa2f185edec71172c976bece6d"} Feb 24 00:27:08 
crc kubenswrapper[4824]: I0224 00:27:08.866187 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-2-build_e64cc79f-399d-4a53-b509-a6618d565cbf/manage-dockerfile/0.log"
Feb 24 00:27:09 crc kubenswrapper[4824]: I0224 00:27:09.831489 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"e64cc79f-399d-4a53-b509-a6618d565cbf","Type":"ContainerStarted","Data":"09c4d2134f2b9fafd8744d2eff2f7d16da5b901065d6281daff1b2c6fd989a55"}
Feb 24 00:27:09 crc kubenswrapper[4824]: I0224 00:27:09.880998 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-bridge-2-build" podStartSLOduration=4.880968065 podStartE2EDuration="4.880968065s" podCreationTimestamp="2026-02-24 00:27:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:27:09.872905512 +0000 UTC m=+1293.862530021" watchObservedRunningTime="2026-02-24 00:27:09.880968065 +0000 UTC m=+1293.870592584"
Feb 24 00:27:53 crc kubenswrapper[4824]: I0224 00:27:53.276180 4824 patch_prober.go:28] interesting pod/machine-config-daemon-vcbgn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 24 00:27:53 crc kubenswrapper[4824]: I0224 00:27:53.277132 4824 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 24 00:27:54 crc kubenswrapper[4824]: I0224 00:27:54.137697 4824 generic.go:334] "Generic (PLEG): container finished" podID="e64cc79f-399d-4a53-b509-a6618d565cbf" containerID="09c4d2134f2b9fafd8744d2eff2f7d16da5b901065d6281daff1b2c6fd989a55" exitCode=0
Feb 24 00:27:54 crc kubenswrapper[4824]: I0224 00:27:54.137776 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"e64cc79f-399d-4a53-b509-a6618d565cbf","Type":"ContainerDied","Data":"09c4d2134f2b9fafd8744d2eff2f7d16da5b901065d6281daff1b2c6fd989a55"}
Feb 24 00:27:55 crc kubenswrapper[4824]: I0224 00:27:55.474316 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-2-build"
Feb 24 00:27:55 crc kubenswrapper[4824]: I0224 00:27:55.607898 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e64cc79f-399d-4a53-b509-a6618d565cbf-build-blob-cache\") pod \"e64cc79f-399d-4a53-b509-a6618d565cbf\" (UID: \"e64cc79f-399d-4a53-b509-a6618d565cbf\") "
Feb 24 00:27:55 crc kubenswrapper[4824]: I0224 00:27:55.607980 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnvml\" (UniqueName: \"kubernetes.io/projected/e64cc79f-399d-4a53-b509-a6618d565cbf-kube-api-access-bnvml\") pod \"e64cc79f-399d-4a53-b509-a6618d565cbf\" (UID: \"e64cc79f-399d-4a53-b509-a6618d565cbf\") "
Feb 24 00:27:55 crc kubenswrapper[4824]: I0224 00:27:55.608025 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e64cc79f-399d-4a53-b509-a6618d565cbf-buildcachedir\") pod \"e64cc79f-399d-4a53-b509-a6618d565cbf\" (UID: \"e64cc79f-399d-4a53-b509-a6618d565cbf\") "
Feb 24 00:27:55 crc kubenswrapper[4824]: I0224 00:27:55.608055 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e64cc79f-399d-4a53-b509-a6618d565cbf-build-system-configs\") pod \"e64cc79f-399d-4a53-b509-a6618d565cbf\" (UID: \"e64cc79f-399d-4a53-b509-a6618d565cbf\") "
Feb 24 00:27:55 crc kubenswrapper[4824]: I0224 00:27:55.608099 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/e64cc79f-399d-4a53-b509-a6618d565cbf-builder-dockercfg-gqxxs-push\") pod \"e64cc79f-399d-4a53-b509-a6618d565cbf\" (UID: \"e64cc79f-399d-4a53-b509-a6618d565cbf\") "
Feb 24 00:27:55 crc kubenswrapper[4824]: I0224 00:27:55.608141 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e64cc79f-399d-4a53-b509-a6618d565cbf-build-proxy-ca-bundles\") pod \"e64cc79f-399d-4a53-b509-a6618d565cbf\" (UID: \"e64cc79f-399d-4a53-b509-a6618d565cbf\") "
Feb 24 00:27:55 crc kubenswrapper[4824]: I0224 00:27:55.608201 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e64cc79f-399d-4a53-b509-a6618d565cbf-container-storage-root\") pod \"e64cc79f-399d-4a53-b509-a6618d565cbf\" (UID: \"e64cc79f-399d-4a53-b509-a6618d565cbf\") "
Feb 24 00:27:55 crc kubenswrapper[4824]: I0224 00:27:55.608243 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e64cc79f-399d-4a53-b509-a6618d565cbf-buildworkdir\") pod \"e64cc79f-399d-4a53-b509-a6618d565cbf\" (UID: \"e64cc79f-399d-4a53-b509-a6618d565cbf\") "
Feb 24 00:27:55 crc kubenswrapper[4824]: I0224 00:27:55.608281 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e64cc79f-399d-4a53-b509-a6618d565cbf-container-storage-run\") pod \"e64cc79f-399d-4a53-b509-a6618d565cbf\" (UID: \"e64cc79f-399d-4a53-b509-a6618d565cbf\") "
Feb 24 00:27:55 crc kubenswrapper[4824]: I0224 00:27:55.608262 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e64cc79f-399d-4a53-b509-a6618d565cbf-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "e64cc79f-399d-4a53-b509-a6618d565cbf" (UID: "e64cc79f-399d-4a53-b509-a6618d565cbf"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 24 00:27:55 crc kubenswrapper[4824]: I0224 00:27:55.608297 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e64cc79f-399d-4a53-b509-a6618d565cbf-node-pullsecrets\") pod \"e64cc79f-399d-4a53-b509-a6618d565cbf\" (UID: \"e64cc79f-399d-4a53-b509-a6618d565cbf\") "
Feb 24 00:27:55 crc kubenswrapper[4824]: I0224 00:27:55.608379 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e64cc79f-399d-4a53-b509-a6618d565cbf-build-ca-bundles\") pod \"e64cc79f-399d-4a53-b509-a6618d565cbf\" (UID: \"e64cc79f-399d-4a53-b509-a6618d565cbf\") "
Feb 24 00:27:55 crc kubenswrapper[4824]: I0224 00:27:55.608390 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e64cc79f-399d-4a53-b509-a6618d565cbf-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "e64cc79f-399d-4a53-b509-a6618d565cbf" (UID: "e64cc79f-399d-4a53-b509-a6618d565cbf"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 24 00:27:55 crc kubenswrapper[4824]: I0224 00:27:55.608408 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/e64cc79f-399d-4a53-b509-a6618d565cbf-builder-dockercfg-gqxxs-pull\") pod \"e64cc79f-399d-4a53-b509-a6618d565cbf\" (UID: \"e64cc79f-399d-4a53-b509-a6618d565cbf\") "
Feb 24 00:27:55 crc kubenswrapper[4824]: I0224 00:27:55.608625 4824 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e64cc79f-399d-4a53-b509-a6618d565cbf-node-pullsecrets\") on node \"crc\" DevicePath \"\""
Feb 24 00:27:55 crc kubenswrapper[4824]: I0224 00:27:55.608637 4824 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e64cc79f-399d-4a53-b509-a6618d565cbf-buildcachedir\") on node \"crc\" DevicePath \"\""
Feb 24 00:27:55 crc kubenswrapper[4824]: I0224 00:27:55.609770 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e64cc79f-399d-4a53-b509-a6618d565cbf-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "e64cc79f-399d-4a53-b509-a6618d565cbf" (UID: "e64cc79f-399d-4a53-b509-a6618d565cbf"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 00:27:55 crc kubenswrapper[4824]: I0224 00:27:55.609905 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e64cc79f-399d-4a53-b509-a6618d565cbf-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "e64cc79f-399d-4a53-b509-a6618d565cbf" (UID: "e64cc79f-399d-4a53-b509-a6618d565cbf"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 00:27:55 crc kubenswrapper[4824]: I0224 00:27:55.610314 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e64cc79f-399d-4a53-b509-a6618d565cbf-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "e64cc79f-399d-4a53-b509-a6618d565cbf" (UID: "e64cc79f-399d-4a53-b509-a6618d565cbf"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 00:27:55 crc kubenswrapper[4824]: I0224 00:27:55.610665 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e64cc79f-399d-4a53-b509-a6618d565cbf-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "e64cc79f-399d-4a53-b509-a6618d565cbf" (UID: "e64cc79f-399d-4a53-b509-a6618d565cbf"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 00:27:55 crc kubenswrapper[4824]: I0224 00:27:55.610851 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e64cc79f-399d-4a53-b509-a6618d565cbf-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "e64cc79f-399d-4a53-b509-a6618d565cbf" (UID: "e64cc79f-399d-4a53-b509-a6618d565cbf"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 00:27:55 crc kubenswrapper[4824]: I0224 00:27:55.615912 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e64cc79f-399d-4a53-b509-a6618d565cbf-kube-api-access-bnvml" (OuterVolumeSpecName: "kube-api-access-bnvml") pod "e64cc79f-399d-4a53-b509-a6618d565cbf" (UID: "e64cc79f-399d-4a53-b509-a6618d565cbf"). InnerVolumeSpecName "kube-api-access-bnvml". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 00:27:55 crc kubenswrapper[4824]: I0224 00:27:55.616034 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e64cc79f-399d-4a53-b509-a6618d565cbf-builder-dockercfg-gqxxs-push" (OuterVolumeSpecName: "builder-dockercfg-gqxxs-push") pod "e64cc79f-399d-4a53-b509-a6618d565cbf" (UID: "e64cc79f-399d-4a53-b509-a6618d565cbf"). InnerVolumeSpecName "builder-dockercfg-gqxxs-push". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 00:27:55 crc kubenswrapper[4824]: I0224 00:27:55.617114 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e64cc79f-399d-4a53-b509-a6618d565cbf-builder-dockercfg-gqxxs-pull" (OuterVolumeSpecName: "builder-dockercfg-gqxxs-pull") pod "e64cc79f-399d-4a53-b509-a6618d565cbf" (UID: "e64cc79f-399d-4a53-b509-a6618d565cbf"). InnerVolumeSpecName "builder-dockercfg-gqxxs-pull". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 00:27:55 crc kubenswrapper[4824]: I0224 00:27:55.710429 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnvml\" (UniqueName: \"kubernetes.io/projected/e64cc79f-399d-4a53-b509-a6618d565cbf-kube-api-access-bnvml\") on node \"crc\" DevicePath \"\""
Feb 24 00:27:55 crc kubenswrapper[4824]: I0224 00:27:55.710490 4824 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e64cc79f-399d-4a53-b509-a6618d565cbf-build-system-configs\") on node \"crc\" DevicePath \"\""
Feb 24 00:27:55 crc kubenswrapper[4824]: I0224 00:27:55.710509 4824 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/e64cc79f-399d-4a53-b509-a6618d565cbf-builder-dockercfg-gqxxs-push\") on node \"crc\" DevicePath \"\""
Feb 24 00:27:55 crc kubenswrapper[4824]: I0224 00:27:55.710563 4824 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e64cc79f-399d-4a53-b509-a6618d565cbf-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 24 00:27:55 crc kubenswrapper[4824]: I0224 00:27:55.710583 4824 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e64cc79f-399d-4a53-b509-a6618d565cbf-buildworkdir\") on node \"crc\" DevicePath \"\""
Feb 24 00:27:55 crc kubenswrapper[4824]: I0224 00:27:55.710605 4824 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e64cc79f-399d-4a53-b509-a6618d565cbf-container-storage-run\") on node \"crc\" DevicePath \"\""
Feb 24 00:27:55 crc kubenswrapper[4824]: I0224 00:27:55.710623 4824 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e64cc79f-399d-4a53-b509-a6618d565cbf-build-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 24 00:27:55 crc kubenswrapper[4824]: I0224 00:27:55.710642 4824 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/e64cc79f-399d-4a53-b509-a6618d565cbf-builder-dockercfg-gqxxs-pull\") on node \"crc\" DevicePath \"\""
Feb 24 00:27:55 crc kubenswrapper[4824]: I0224 00:27:55.740474 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e64cc79f-399d-4a53-b509-a6618d565cbf-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "e64cc79f-399d-4a53-b509-a6618d565cbf" (UID: "e64cc79f-399d-4a53-b509-a6618d565cbf"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 00:27:55 crc kubenswrapper[4824]: I0224 00:27:55.812199 4824 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e64cc79f-399d-4a53-b509-a6618d565cbf-build-blob-cache\") on node \"crc\" DevicePath \"\""
Feb 24 00:27:56 crc kubenswrapper[4824]: I0224 00:27:56.154185 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"e64cc79f-399d-4a53-b509-a6618d565cbf","Type":"ContainerDied","Data":"954236d9ba17ac1f320cbfdd5017cf9874f771c683ae5d5acffcb803cd49b53e"}
Feb 24 00:27:56 crc kubenswrapper[4824]: I0224 00:27:56.154235 4824 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="954236d9ba17ac1f320cbfdd5017cf9874f771c683ae5d5acffcb803cd49b53e"
Feb 24 00:27:56 crc kubenswrapper[4824]: I0224 00:27:56.154341 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-2-build"
Feb 24 00:27:56 crc kubenswrapper[4824]: I0224 00:27:56.324040 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e64cc79f-399d-4a53-b509-a6618d565cbf-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "e64cc79f-399d-4a53-b509-a6618d565cbf" (UID: "e64cc79f-399d-4a53-b509-a6618d565cbf"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 00:27:56 crc kubenswrapper[4824]: I0224 00:27:56.420368 4824 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e64cc79f-399d-4a53-b509-a6618d565cbf-container-storage-root\") on node \"crc\" DevicePath \"\""
Feb 24 00:27:59 crc kubenswrapper[4824]: I0224 00:27:59.449337 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"]
Feb 24 00:27:59 crc kubenswrapper[4824]: E0224 00:27:59.450629 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e64cc79f-399d-4a53-b509-a6618d565cbf" containerName="git-clone"
Feb 24 00:27:59 crc kubenswrapper[4824]: I0224 00:27:59.450649 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="e64cc79f-399d-4a53-b509-a6618d565cbf" containerName="git-clone"
Feb 24 00:27:59 crc kubenswrapper[4824]: E0224 00:27:59.450664 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e64cc79f-399d-4a53-b509-a6618d565cbf" containerName="docker-build"
Feb 24 00:27:59 crc kubenswrapper[4824]: I0224 00:27:59.450671 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="e64cc79f-399d-4a53-b509-a6618d565cbf" containerName="docker-build"
Feb 24 00:27:59 crc kubenswrapper[4824]: E0224 00:27:59.450694 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e64cc79f-399d-4a53-b509-a6618d565cbf" containerName="manage-dockerfile"
Feb 24 00:27:59 crc kubenswrapper[4824]: I0224 00:27:59.450704 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="e64cc79f-399d-4a53-b509-a6618d565cbf" containerName="manage-dockerfile"
Feb 24 00:27:59 crc kubenswrapper[4824]: I0224 00:27:59.450850 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="e64cc79f-399d-4a53-b509-a6618d565cbf" containerName="docker-build"
Feb 24 00:27:59 crc kubenswrapper[4824]: I0224 00:27:59.451824 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 24 00:27:59 crc kubenswrapper[4824]: I0224 00:27:59.456057 4824 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-gqxxs"
Feb 24 00:27:59 crc kubenswrapper[4824]: I0224 00:27:59.457809 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-1-sys-config"
Feb 24 00:27:59 crc kubenswrapper[4824]: I0224 00:27:59.458809 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-1-global-ca"
Feb 24 00:27:59 crc kubenswrapper[4824]: I0224 00:27:59.458957 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-1-ca"
Feb 24 00:27:59 crc kubenswrapper[4824]: I0224 00:27:59.468067 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"]
Feb 24 00:27:59 crc kubenswrapper[4824]: I0224 00:27:59.579010 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nc2pf\" (UniqueName: \"kubernetes.io/projected/b4ba4a5c-0b11-4224-93c1-916afc845dad-kube-api-access-nc2pf\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"b4ba4a5c-0b11-4224-93c1-916afc845dad\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 24 00:27:59 crc kubenswrapper[4824]: I0224 00:27:59.579708 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/b4ba4a5c-0b11-4224-93c1-916afc845dad-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"b4ba4a5c-0b11-4224-93c1-916afc845dad\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 24 00:27:59 crc kubenswrapper[4824]: I0224 00:27:59.579920 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b4ba4a5c-0b11-4224-93c1-916afc845dad-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"b4ba4a5c-0b11-4224-93c1-916afc845dad\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 24 00:27:59 crc kubenswrapper[4824]: I0224 00:27:59.580020 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/b4ba4a5c-0b11-4224-93c1-916afc845dad-builder-dockercfg-gqxxs-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"b4ba4a5c-0b11-4224-93c1-916afc845dad\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 24 00:27:59 crc kubenswrapper[4824]: I0224 00:27:59.580111 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b4ba4a5c-0b11-4224-93c1-916afc845dad-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"b4ba4a5c-0b11-4224-93c1-916afc845dad\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 24 00:27:59 crc kubenswrapper[4824]: I0224 00:27:59.580229 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b4ba4a5c-0b11-4224-93c1-916afc845dad-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"b4ba4a5c-0b11-4224-93c1-916afc845dad\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 24 00:27:59 crc kubenswrapper[4824]: I0224 00:27:59.580372 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/b4ba4a5c-0b11-4224-93c1-916afc845dad-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"b4ba4a5c-0b11-4224-93c1-916afc845dad\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 24 00:27:59 crc kubenswrapper[4824]: I0224 00:27:59.580418 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/b4ba4a5c-0b11-4224-93c1-916afc845dad-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"b4ba4a5c-0b11-4224-93c1-916afc845dad\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 24 00:27:59 crc kubenswrapper[4824]: I0224 00:27:59.580468 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/b4ba4a5c-0b11-4224-93c1-916afc845dad-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"b4ba4a5c-0b11-4224-93c1-916afc845dad\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 24 00:27:59 crc kubenswrapper[4824]: I0224 00:27:59.580621 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/b4ba4a5c-0b11-4224-93c1-916afc845dad-builder-dockercfg-gqxxs-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"b4ba4a5c-0b11-4224-93c1-916afc845dad\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 24 00:27:59 crc kubenswrapper[4824]: I0224 00:27:59.580731 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/b4ba4a5c-0b11-4224-93c1-916afc845dad-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"b4ba4a5c-0b11-4224-93c1-916afc845dad\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 24 00:27:59 crc kubenswrapper[4824]: I0224 00:27:59.580778 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/b4ba4a5c-0b11-4224-93c1-916afc845dad-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"b4ba4a5c-0b11-4224-93c1-916afc845dad\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 24 00:27:59 crc kubenswrapper[4824]: I0224 00:27:59.682379 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nc2pf\" (UniqueName: \"kubernetes.io/projected/b4ba4a5c-0b11-4224-93c1-916afc845dad-kube-api-access-nc2pf\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"b4ba4a5c-0b11-4224-93c1-916afc845dad\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 24 00:27:59 crc kubenswrapper[4824]: I0224 00:27:59.682462 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/b4ba4a5c-0b11-4224-93c1-916afc845dad-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"b4ba4a5c-0b11-4224-93c1-916afc845dad\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 24 00:27:59 crc kubenswrapper[4824]: I0224 00:27:59.682486 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b4ba4a5c-0b11-4224-93c1-916afc845dad-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"b4ba4a5c-0b11-4224-93c1-916afc845dad\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 24 00:27:59 crc kubenswrapper[4824]: I0224 00:27:59.682507 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/b4ba4a5c-0b11-4224-93c1-916afc845dad-builder-dockercfg-gqxxs-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"b4ba4a5c-0b11-4224-93c1-916afc845dad\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 24 00:27:59 crc kubenswrapper[4824]: I0224 00:27:59.682546 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b4ba4a5c-0b11-4224-93c1-916afc845dad-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"b4ba4a5c-0b11-4224-93c1-916afc845dad\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 24 00:27:59 crc kubenswrapper[4824]: I0224 00:27:59.682573 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b4ba4a5c-0b11-4224-93c1-916afc845dad-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"b4ba4a5c-0b11-4224-93c1-916afc845dad\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 24 00:27:59 crc kubenswrapper[4824]: I0224 00:27:59.682611 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/b4ba4a5c-0b11-4224-93c1-916afc845dad-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"b4ba4a5c-0b11-4224-93c1-916afc845dad\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 24 00:27:59 crc kubenswrapper[4824]: I0224 00:27:59.682632 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/b4ba4a5c-0b11-4224-93c1-916afc845dad-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"b4ba4a5c-0b11-4224-93c1-916afc845dad\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 24 00:27:59 crc kubenswrapper[4824]: I0224 00:27:59.682661 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/b4ba4a5c-0b11-4224-93c1-916afc845dad-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"b4ba4a5c-0b11-4224-93c1-916afc845dad\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 24 00:27:59 crc kubenswrapper[4824]: I0224 00:27:59.682682 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/b4ba4a5c-0b11-4224-93c1-916afc845dad-builder-dockercfg-gqxxs-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"b4ba4a5c-0b11-4224-93c1-916afc845dad\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 24 00:27:59 crc kubenswrapper[4824]: I0224 00:27:59.682708 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/b4ba4a5c-0b11-4224-93c1-916afc845dad-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"b4ba4a5c-0b11-4224-93c1-916afc845dad\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 24 00:27:59 crc kubenswrapper[4824]: I0224 00:27:59.682733 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/b4ba4a5c-0b11-4224-93c1-916afc845dad-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"b4ba4a5c-0b11-4224-93c1-916afc845dad\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 24 00:27:59 crc kubenswrapper[4824]: I0224 00:27:59.683278 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/b4ba4a5c-0b11-4224-93c1-916afc845dad-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"b4ba4a5c-0b11-4224-93c1-916afc845dad\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 24 00:27:59 crc kubenswrapper[4824]: I0224 00:27:59.683532 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b4ba4a5c-0b11-4224-93c1-916afc845dad-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"b4ba4a5c-0b11-4224-93c1-916afc845dad\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 24 00:27:59 crc kubenswrapper[4824]: I0224 00:27:59.683715 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/b4ba4a5c-0b11-4224-93c1-916afc845dad-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"b4ba4a5c-0b11-4224-93c1-916afc845dad\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 24 00:27:59 crc kubenswrapper[4824]: I0224 00:27:59.684194 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b4ba4a5c-0b11-4224-93c1-916afc845dad-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"b4ba4a5c-0b11-4224-93c1-916afc845dad\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 24 00:27:59 crc kubenswrapper[4824]: I0224 00:27:59.684186 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b4ba4a5c-0b11-4224-93c1-916afc845dad-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"b4ba4a5c-0b11-4224-93c1-916afc845dad\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 24 00:27:59 crc kubenswrapper[4824]: I0224 00:27:59.684605 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/b4ba4a5c-0b11-4224-93c1-916afc845dad-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"b4ba4a5c-0b11-4224-93c1-916afc845dad\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 24 00:27:59 crc kubenswrapper[4824]: I0224 00:27:59.684934 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/b4ba4a5c-0b11-4224-93c1-916afc845dad-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"b4ba4a5c-0b11-4224-93c1-916afc845dad\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 24 00:27:59 crc kubenswrapper[4824]: I0224 00:27:59.685046 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/b4ba4a5c-0b11-4224-93c1-916afc845dad-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"b4ba4a5c-0b11-4224-93c1-916afc845dad\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 24 00:27:59 crc kubenswrapper[4824]: I0224 00:27:59.685241 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/b4ba4a5c-0b11-4224-93c1-916afc845dad-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"b4ba4a5c-0b11-4224-93c1-916afc845dad\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 24 00:27:59 crc kubenswrapper[4824]: I0224 00:27:59.694544 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/b4ba4a5c-0b11-4224-93c1-916afc845dad-builder-dockercfg-gqxxs-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"b4ba4a5c-0b11-4224-93c1-916afc845dad\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 24 00:27:59 crc kubenswrapper[4824]: I0224 00:27:59.694695 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/b4ba4a5c-0b11-4224-93c1-916afc845dad-builder-dockercfg-gqxxs-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"b4ba4a5c-0b11-4224-93c1-916afc845dad\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 24 00:27:59 crc kubenswrapper[4824]: I0224 00:27:59.705299 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nc2pf\" (UniqueName: \"kubernetes.io/projected/b4ba4a5c-0b11-4224-93c1-916afc845dad-kube-api-access-nc2pf\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"b4ba4a5c-0b11-4224-93c1-916afc845dad\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 24 00:27:59 crc kubenswrapper[4824]: I0224 00:27:59.770140 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 24 00:28:00 crc kubenswrapper[4824]: I0224 00:28:00.070363 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"]
Feb 24 00:28:00 crc kubenswrapper[4824]: I0224 00:28:00.187041 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"b4ba4a5c-0b11-4224-93c1-916afc845dad","Type":"ContainerStarted","Data":"a0b3e777855e84eda295c0db150df985df6c58f2774ea136065601b5d232eb13"}
Feb 24 00:28:01 crc kubenswrapper[4824]: I0224 00:28:01.195875 4824 generic.go:334] "Generic (PLEG): container finished" podID="b4ba4a5c-0b11-4224-93c1-916afc845dad" containerID="3168d2fe985406ad1ee57f8645b40dfea48be1b24606a88cad673ecfbe808feb" exitCode=0
Feb 24 00:28:01 crc kubenswrapper[4824]: I0224 00:28:01.195992 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"b4ba4a5c-0b11-4224-93c1-916afc845dad","Type":"ContainerDied","Data":"3168d2fe985406ad1ee57f8645b40dfea48be1b24606a88cad673ecfbe808feb"}
Feb 24 00:28:02 crc kubenswrapper[4824]: I0224 00:28:02.205637 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"b4ba4a5c-0b11-4224-93c1-916afc845dad","Type":"ContainerStarted","Data":"a53ed1fad2fb3c6218dfc006e9192510785f4c7b5539ee92f99bf688c20a7f9b"}
Feb 24 00:28:10 crc kubenswrapper[4824]: I0224 00:28:10.554964 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-webhook-snmp-1-build" podStartSLOduration=11.554938831 podStartE2EDuration="11.554938831s" podCreationTimestamp="2026-02-24 00:27:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:28:02.239202004 +0000 UTC m=+1346.228826493" watchObservedRunningTime="2026-02-24 00:28:10.554938831 +0000 UTC m=+1354.544563310"
Feb 24 00:28:10 crc kubenswrapper[4824]: I0224 00:28:10.559715 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"]
Feb 24 00:28:10 crc kubenswrapper[4824]: I0224 00:28:10.560034 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/prometheus-webhook-snmp-1-build" podUID="b4ba4a5c-0b11-4224-93c1-916afc845dad" containerName="docker-build" containerID="cri-o://a53ed1fad2fb3c6218dfc006e9192510785f4c7b5539ee92f99bf688c20a7f9b" gracePeriod=30
Feb 24 00:28:10 crc kubenswrapper[4824]: I0224 00:28:10.979760 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-1-build_b4ba4a5c-0b11-4224-93c1-916afc845dad/docker-build/0.log"
Feb 24 00:28:10 crc kubenswrapper[4824]: I0224 00:28:10.980472 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build"
Feb 24 00:28:11 crc kubenswrapper[4824]: I0224 00:28:11.047414 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/b4ba4a5c-0b11-4224-93c1-916afc845dad-build-system-configs\") pod \"b4ba4a5c-0b11-4224-93c1-916afc845dad\" (UID: \"b4ba4a5c-0b11-4224-93c1-916afc845dad\") "
Feb 24 00:28:11 crc kubenswrapper[4824]: I0224 00:28:11.048694 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4ba4a5c-0b11-4224-93c1-916afc845dad-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "b4ba4a5c-0b11-4224-93c1-916afc845dad" (UID: "b4ba4a5c-0b11-4224-93c1-916afc845dad"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 00:28:11 crc kubenswrapper[4824]: I0224 00:28:11.149190 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b4ba4a5c-0b11-4224-93c1-916afc845dad-node-pullsecrets\") pod \"b4ba4a5c-0b11-4224-93c1-916afc845dad\" (UID: \"b4ba4a5c-0b11-4224-93c1-916afc845dad\") "
Feb 24 00:28:11 crc kubenswrapper[4824]: I0224 00:28:11.149277 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/b4ba4a5c-0b11-4224-93c1-916afc845dad-builder-dockercfg-gqxxs-pull\") pod \"b4ba4a5c-0b11-4224-93c1-916afc845dad\" (UID: \"b4ba4a5c-0b11-4224-93c1-916afc845dad\") "
Feb 24 00:28:11 crc kubenswrapper[4824]: I0224 00:28:11.149323 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b4ba4a5c-0b11-4224-93c1-916afc845dad-build-proxy-ca-bundles\") pod \"b4ba4a5c-0b11-4224-93c1-916afc845dad\" (UID:
\"b4ba4a5c-0b11-4224-93c1-916afc845dad\") " Feb 24 00:28:11 crc kubenswrapper[4824]: I0224 00:28:11.149381 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nc2pf\" (UniqueName: \"kubernetes.io/projected/b4ba4a5c-0b11-4224-93c1-916afc845dad-kube-api-access-nc2pf\") pod \"b4ba4a5c-0b11-4224-93c1-916afc845dad\" (UID: \"b4ba4a5c-0b11-4224-93c1-916afc845dad\") " Feb 24 00:28:11 crc kubenswrapper[4824]: I0224 00:28:11.149363 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b4ba4a5c-0b11-4224-93c1-916afc845dad-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "b4ba4a5c-0b11-4224-93c1-916afc845dad" (UID: "b4ba4a5c-0b11-4224-93c1-916afc845dad"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 00:28:11 crc kubenswrapper[4824]: I0224 00:28:11.149423 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/b4ba4a5c-0b11-4224-93c1-916afc845dad-build-blob-cache\") pod \"b4ba4a5c-0b11-4224-93c1-916afc845dad\" (UID: \"b4ba4a5c-0b11-4224-93c1-916afc845dad\") " Feb 24 00:28:11 crc kubenswrapper[4824]: I0224 00:28:11.149455 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/b4ba4a5c-0b11-4224-93c1-916afc845dad-buildcachedir\") pod \"b4ba4a5c-0b11-4224-93c1-916afc845dad\" (UID: \"b4ba4a5c-0b11-4224-93c1-916afc845dad\") " Feb 24 00:28:11 crc kubenswrapper[4824]: I0224 00:28:11.149494 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/b4ba4a5c-0b11-4224-93c1-916afc845dad-container-storage-run\") pod \"b4ba4a5c-0b11-4224-93c1-916afc845dad\" (UID: \"b4ba4a5c-0b11-4224-93c1-916afc845dad\") " Feb 24 00:28:11 crc kubenswrapper[4824]: I0224 
00:28:11.149571 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/b4ba4a5c-0b11-4224-93c1-916afc845dad-container-storage-root\") pod \"b4ba4a5c-0b11-4224-93c1-916afc845dad\" (UID: \"b4ba4a5c-0b11-4224-93c1-916afc845dad\") " Feb 24 00:28:11 crc kubenswrapper[4824]: I0224 00:28:11.149596 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b4ba4a5c-0b11-4224-93c1-916afc845dad-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "b4ba4a5c-0b11-4224-93c1-916afc845dad" (UID: "b4ba4a5c-0b11-4224-93c1-916afc845dad"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 00:28:11 crc kubenswrapper[4824]: I0224 00:28:11.149630 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b4ba4a5c-0b11-4224-93c1-916afc845dad-build-ca-bundles\") pod \"b4ba4a5c-0b11-4224-93c1-916afc845dad\" (UID: \"b4ba4a5c-0b11-4224-93c1-916afc845dad\") " Feb 24 00:28:11 crc kubenswrapper[4824]: I0224 00:28:11.149685 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/b4ba4a5c-0b11-4224-93c1-916afc845dad-builder-dockercfg-gqxxs-push\") pod \"b4ba4a5c-0b11-4224-93c1-916afc845dad\" (UID: \"b4ba4a5c-0b11-4224-93c1-916afc845dad\") " Feb 24 00:28:11 crc kubenswrapper[4824]: I0224 00:28:11.149753 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/b4ba4a5c-0b11-4224-93c1-916afc845dad-buildworkdir\") pod \"b4ba4a5c-0b11-4224-93c1-916afc845dad\" (UID: \"b4ba4a5c-0b11-4224-93c1-916afc845dad\") " Feb 24 00:28:11 crc kubenswrapper[4824]: I0224 00:28:11.150170 4824 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" 
(UniqueName: \"kubernetes.io/host-path/b4ba4a5c-0b11-4224-93c1-916afc845dad-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Feb 24 00:28:11 crc kubenswrapper[4824]: I0224 00:28:11.150193 4824 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/b4ba4a5c-0b11-4224-93c1-916afc845dad-build-system-configs\") on node \"crc\" DevicePath \"\"" Feb 24 00:28:11 crc kubenswrapper[4824]: I0224 00:28:11.150211 4824 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/b4ba4a5c-0b11-4224-93c1-916afc845dad-buildcachedir\") on node \"crc\" DevicePath \"\"" Feb 24 00:28:11 crc kubenswrapper[4824]: I0224 00:28:11.155882 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4ba4a5c-0b11-4224-93c1-916afc845dad-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "b4ba4a5c-0b11-4224-93c1-916afc845dad" (UID: "b4ba4a5c-0b11-4224-93c1-916afc845dad"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:28:11 crc kubenswrapper[4824]: I0224 00:28:11.156731 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4ba4a5c-0b11-4224-93c1-916afc845dad-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "b4ba4a5c-0b11-4224-93c1-916afc845dad" (UID: "b4ba4a5c-0b11-4224-93c1-916afc845dad"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:28:11 crc kubenswrapper[4824]: I0224 00:28:11.158246 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4ba4a5c-0b11-4224-93c1-916afc845dad-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "b4ba4a5c-0b11-4224-93c1-916afc845dad" (UID: "b4ba4a5c-0b11-4224-93c1-916afc845dad"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:28:11 crc kubenswrapper[4824]: I0224 00:28:11.158350 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4ba4a5c-0b11-4224-93c1-916afc845dad-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "b4ba4a5c-0b11-4224-93c1-916afc845dad" (UID: "b4ba4a5c-0b11-4224-93c1-916afc845dad"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:28:11 crc kubenswrapper[4824]: I0224 00:28:11.161434 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4ba4a5c-0b11-4224-93c1-916afc845dad-builder-dockercfg-gqxxs-pull" (OuterVolumeSpecName: "builder-dockercfg-gqxxs-pull") pod "b4ba4a5c-0b11-4224-93c1-916afc845dad" (UID: "b4ba4a5c-0b11-4224-93c1-916afc845dad"). InnerVolumeSpecName "builder-dockercfg-gqxxs-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:28:11 crc kubenswrapper[4824]: I0224 00:28:11.162319 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4ba4a5c-0b11-4224-93c1-916afc845dad-kube-api-access-nc2pf" (OuterVolumeSpecName: "kube-api-access-nc2pf") pod "b4ba4a5c-0b11-4224-93c1-916afc845dad" (UID: "b4ba4a5c-0b11-4224-93c1-916afc845dad"). InnerVolumeSpecName "kube-api-access-nc2pf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:28:11 crc kubenswrapper[4824]: I0224 00:28:11.162322 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4ba4a5c-0b11-4224-93c1-916afc845dad-builder-dockercfg-gqxxs-push" (OuterVolumeSpecName: "builder-dockercfg-gqxxs-push") pod "b4ba4a5c-0b11-4224-93c1-916afc845dad" (UID: "b4ba4a5c-0b11-4224-93c1-916afc845dad"). InnerVolumeSpecName "builder-dockercfg-gqxxs-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:28:11 crc kubenswrapper[4824]: I0224 00:28:11.250693 4824 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/b4ba4a5c-0b11-4224-93c1-916afc845dad-builder-dockercfg-gqxxs-pull\") on node \"crc\" DevicePath \"\"" Feb 24 00:28:11 crc kubenswrapper[4824]: I0224 00:28:11.250735 4824 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b4ba4a5c-0b11-4224-93c1-916afc845dad-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 24 00:28:11 crc kubenswrapper[4824]: I0224 00:28:11.250747 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nc2pf\" (UniqueName: \"kubernetes.io/projected/b4ba4a5c-0b11-4224-93c1-916afc845dad-kube-api-access-nc2pf\") on node \"crc\" DevicePath \"\"" Feb 24 00:28:11 crc kubenswrapper[4824]: I0224 00:28:11.250758 4824 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/b4ba4a5c-0b11-4224-93c1-916afc845dad-container-storage-run\") on node \"crc\" DevicePath \"\"" Feb 24 00:28:11 crc kubenswrapper[4824]: I0224 00:28:11.250771 4824 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b4ba4a5c-0b11-4224-93c1-916afc845dad-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 24 00:28:11 crc kubenswrapper[4824]: I0224 00:28:11.250783 4824 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/b4ba4a5c-0b11-4224-93c1-916afc845dad-builder-dockercfg-gqxxs-push\") on node \"crc\" DevicePath \"\"" Feb 24 00:28:11 crc kubenswrapper[4824]: I0224 00:28:11.250794 4824 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/b4ba4a5c-0b11-4224-93c1-916afc845dad-buildworkdir\") on node 
\"crc\" DevicePath \"\"" Feb 24 00:28:11 crc kubenswrapper[4824]: I0224 00:28:11.254307 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4ba4a5c-0b11-4224-93c1-916afc845dad-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "b4ba4a5c-0b11-4224-93c1-916afc845dad" (UID: "b4ba4a5c-0b11-4224-93c1-916afc845dad"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:28:11 crc kubenswrapper[4824]: I0224 00:28:11.285018 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-1-build_b4ba4a5c-0b11-4224-93c1-916afc845dad/docker-build/0.log" Feb 24 00:28:11 crc kubenswrapper[4824]: I0224 00:28:11.286614 4824 generic.go:334] "Generic (PLEG): container finished" podID="b4ba4a5c-0b11-4224-93c1-916afc845dad" containerID="a53ed1fad2fb3c6218dfc006e9192510785f4c7b5539ee92f99bf688c20a7f9b" exitCode=1 Feb 24 00:28:11 crc kubenswrapper[4824]: I0224 00:28:11.286656 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"b4ba4a5c-0b11-4224-93c1-916afc845dad","Type":"ContainerDied","Data":"a53ed1fad2fb3c6218dfc006e9192510785f4c7b5539ee92f99bf688c20a7f9b"} Feb 24 00:28:11 crc kubenswrapper[4824]: I0224 00:28:11.286692 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"b4ba4a5c-0b11-4224-93c1-916afc845dad","Type":"ContainerDied","Data":"a0b3e777855e84eda295c0db150df985df6c58f2774ea136065601b5d232eb13"} Feb 24 00:28:11 crc kubenswrapper[4824]: I0224 00:28:11.286719 4824 scope.go:117] "RemoveContainer" containerID="a53ed1fad2fb3c6218dfc006e9192510785f4c7b5539ee92f99bf688c20a7f9b" Feb 24 00:28:11 crc kubenswrapper[4824]: I0224 00:28:11.286879 4824 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Feb 24 00:28:11 crc kubenswrapper[4824]: I0224 00:28:11.312551 4824 scope.go:117] "RemoveContainer" containerID="3168d2fe985406ad1ee57f8645b40dfea48be1b24606a88cad673ecfbe808feb" Feb 24 00:28:11 crc kubenswrapper[4824]: I0224 00:28:11.340063 4824 scope.go:117] "RemoveContainer" containerID="a53ed1fad2fb3c6218dfc006e9192510785f4c7b5539ee92f99bf688c20a7f9b" Feb 24 00:28:11 crc kubenswrapper[4824]: E0224 00:28:11.340547 4824 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a53ed1fad2fb3c6218dfc006e9192510785f4c7b5539ee92f99bf688c20a7f9b\": container with ID starting with a53ed1fad2fb3c6218dfc006e9192510785f4c7b5539ee92f99bf688c20a7f9b not found: ID does not exist" containerID="a53ed1fad2fb3c6218dfc006e9192510785f4c7b5539ee92f99bf688c20a7f9b" Feb 24 00:28:11 crc kubenswrapper[4824]: I0224 00:28:11.340607 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a53ed1fad2fb3c6218dfc006e9192510785f4c7b5539ee92f99bf688c20a7f9b"} err="failed to get container status \"a53ed1fad2fb3c6218dfc006e9192510785f4c7b5539ee92f99bf688c20a7f9b\": rpc error: code = NotFound desc = could not find container \"a53ed1fad2fb3c6218dfc006e9192510785f4c7b5539ee92f99bf688c20a7f9b\": container with ID starting with a53ed1fad2fb3c6218dfc006e9192510785f4c7b5539ee92f99bf688c20a7f9b not found: ID does not exist" Feb 24 00:28:11 crc kubenswrapper[4824]: I0224 00:28:11.340648 4824 scope.go:117] "RemoveContainer" containerID="3168d2fe985406ad1ee57f8645b40dfea48be1b24606a88cad673ecfbe808feb" Feb 24 00:28:11 crc kubenswrapper[4824]: E0224 00:28:11.341020 4824 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3168d2fe985406ad1ee57f8645b40dfea48be1b24606a88cad673ecfbe808feb\": container with ID starting with 
3168d2fe985406ad1ee57f8645b40dfea48be1b24606a88cad673ecfbe808feb not found: ID does not exist" containerID="3168d2fe985406ad1ee57f8645b40dfea48be1b24606a88cad673ecfbe808feb" Feb 24 00:28:11 crc kubenswrapper[4824]: I0224 00:28:11.341061 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3168d2fe985406ad1ee57f8645b40dfea48be1b24606a88cad673ecfbe808feb"} err="failed to get container status \"3168d2fe985406ad1ee57f8645b40dfea48be1b24606a88cad673ecfbe808feb\": rpc error: code = NotFound desc = could not find container \"3168d2fe985406ad1ee57f8645b40dfea48be1b24606a88cad673ecfbe808feb\": container with ID starting with 3168d2fe985406ad1ee57f8645b40dfea48be1b24606a88cad673ecfbe808feb not found: ID does not exist" Feb 24 00:28:11 crc kubenswrapper[4824]: I0224 00:28:11.351876 4824 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/b4ba4a5c-0b11-4224-93c1-916afc845dad-build-blob-cache\") on node \"crc\" DevicePath \"\"" Feb 24 00:28:11 crc kubenswrapper[4824]: I0224 00:28:11.540211 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4ba4a5c-0b11-4224-93c1-916afc845dad-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "b4ba4a5c-0b11-4224-93c1-916afc845dad" (UID: "b4ba4a5c-0b11-4224-93c1-916afc845dad"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:28:11 crc kubenswrapper[4824]: I0224 00:28:11.555106 4824 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/b4ba4a5c-0b11-4224-93c1-916afc845dad-container-storage-root\") on node \"crc\" DevicePath \"\"" Feb 24 00:28:11 crc kubenswrapper[4824]: I0224 00:28:11.624956 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Feb 24 00:28:11 crc kubenswrapper[4824]: I0224 00:28:11.632058 4824 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Feb 24 00:28:12 crc kubenswrapper[4824]: I0224 00:28:12.369220 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"] Feb 24 00:28:12 crc kubenswrapper[4824]: E0224 00:28:12.369575 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4ba4a5c-0b11-4224-93c1-916afc845dad" containerName="manage-dockerfile" Feb 24 00:28:12 crc kubenswrapper[4824]: I0224 00:28:12.369595 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4ba4a5c-0b11-4224-93c1-916afc845dad" containerName="manage-dockerfile" Feb 24 00:28:12 crc kubenswrapper[4824]: E0224 00:28:12.369627 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4ba4a5c-0b11-4224-93c1-916afc845dad" containerName="docker-build" Feb 24 00:28:12 crc kubenswrapper[4824]: I0224 00:28:12.369639 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4ba4a5c-0b11-4224-93c1-916afc845dad" containerName="docker-build" Feb 24 00:28:12 crc kubenswrapper[4824]: I0224 00:28:12.369817 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4ba4a5c-0b11-4224-93c1-916afc845dad" containerName="docker-build" Feb 24 00:28:12 crc kubenswrapper[4824]: I0224 00:28:12.371234 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 24 00:28:12 crc kubenswrapper[4824]: I0224 00:28:12.374098 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-2-global-ca" Feb 24 00:28:12 crc kubenswrapper[4824]: I0224 00:28:12.374332 4824 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-gqxxs" Feb 24 00:28:12 crc kubenswrapper[4824]: I0224 00:28:12.374379 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-2-ca" Feb 24 00:28:12 crc kubenswrapper[4824]: I0224 00:28:12.375238 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-2-sys-config" Feb 24 00:28:12 crc kubenswrapper[4824]: I0224 00:28:12.393723 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"] Feb 24 00:28:12 crc kubenswrapper[4824]: I0224 00:28:12.469257 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/2502d667-d99a-44c3-9d90-297fa992415f-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"2502d667-d99a-44c3-9d90-297fa992415f\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 24 00:28:12 crc kubenswrapper[4824]: I0224 00:28:12.469394 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/2502d667-d99a-44c3-9d90-297fa992415f-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"2502d667-d99a-44c3-9d90-297fa992415f\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 24 00:28:12 crc kubenswrapper[4824]: I0224 00:28:12.469437 4824 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/2502d667-d99a-44c3-9d90-297fa992415f-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"2502d667-d99a-44c3-9d90-297fa992415f\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 24 00:28:12 crc kubenswrapper[4824]: I0224 00:28:12.469506 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/2502d667-d99a-44c3-9d90-297fa992415f-builder-dockercfg-gqxxs-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"2502d667-d99a-44c3-9d90-297fa992415f\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 24 00:28:12 crc kubenswrapper[4824]: I0224 00:28:12.469582 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2502d667-d99a-44c3-9d90-297fa992415f-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"2502d667-d99a-44c3-9d90-297fa992415f\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 24 00:28:12 crc kubenswrapper[4824]: I0224 00:28:12.469626 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/2502d667-d99a-44c3-9d90-297fa992415f-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"2502d667-d99a-44c3-9d90-297fa992415f\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 24 00:28:12 crc kubenswrapper[4824]: I0224 00:28:12.469675 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2502d667-d99a-44c3-9d90-297fa992415f-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"2502d667-d99a-44c3-9d90-297fa992415f\") " 
pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 24 00:28:12 crc kubenswrapper[4824]: I0224 00:28:12.469729 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/2502d667-d99a-44c3-9d90-297fa992415f-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"2502d667-d99a-44c3-9d90-297fa992415f\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 24 00:28:12 crc kubenswrapper[4824]: I0224 00:28:12.469778 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/2502d667-d99a-44c3-9d90-297fa992415f-builder-dockercfg-gqxxs-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"2502d667-d99a-44c3-9d90-297fa992415f\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 24 00:28:12 crc kubenswrapper[4824]: I0224 00:28:12.469878 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/2502d667-d99a-44c3-9d90-297fa992415f-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"2502d667-d99a-44c3-9d90-297fa992415f\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 24 00:28:12 crc kubenswrapper[4824]: I0224 00:28:12.469927 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/2502d667-d99a-44c3-9d90-297fa992415f-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"2502d667-d99a-44c3-9d90-297fa992415f\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 24 00:28:12 crc kubenswrapper[4824]: I0224 00:28:12.469993 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42hgn\" 
(UniqueName: \"kubernetes.io/projected/2502d667-d99a-44c3-9d90-297fa992415f-kube-api-access-42hgn\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"2502d667-d99a-44c3-9d90-297fa992415f\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 24 00:28:12 crc kubenswrapper[4824]: I0224 00:28:12.570641 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/2502d667-d99a-44c3-9d90-297fa992415f-builder-dockercfg-gqxxs-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"2502d667-d99a-44c3-9d90-297fa992415f\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 24 00:28:12 crc kubenswrapper[4824]: I0224 00:28:12.570729 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2502d667-d99a-44c3-9d90-297fa992415f-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"2502d667-d99a-44c3-9d90-297fa992415f\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 24 00:28:12 crc kubenswrapper[4824]: I0224 00:28:12.570762 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/2502d667-d99a-44c3-9d90-297fa992415f-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"2502d667-d99a-44c3-9d90-297fa992415f\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 24 00:28:12 crc kubenswrapper[4824]: I0224 00:28:12.570789 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2502d667-d99a-44c3-9d90-297fa992415f-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"2502d667-d99a-44c3-9d90-297fa992415f\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 24 00:28:12 crc kubenswrapper[4824]: I0224 00:28:12.570844 4824 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/2502d667-d99a-44c3-9d90-297fa992415f-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"2502d667-d99a-44c3-9d90-297fa992415f\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 24 00:28:12 crc kubenswrapper[4824]: I0224 00:28:12.570878 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/2502d667-d99a-44c3-9d90-297fa992415f-builder-dockercfg-gqxxs-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"2502d667-d99a-44c3-9d90-297fa992415f\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 24 00:28:12 crc kubenswrapper[4824]: I0224 00:28:12.570922 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/2502d667-d99a-44c3-9d90-297fa992415f-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"2502d667-d99a-44c3-9d90-297fa992415f\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 24 00:28:12 crc kubenswrapper[4824]: I0224 00:28:12.571029 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/2502d667-d99a-44c3-9d90-297fa992415f-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"2502d667-d99a-44c3-9d90-297fa992415f\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 24 00:28:12 crc kubenswrapper[4824]: I0224 00:28:12.571515 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42hgn\" (UniqueName: \"kubernetes.io/projected/2502d667-d99a-44c3-9d90-297fa992415f-kube-api-access-42hgn\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"2502d667-d99a-44c3-9d90-297fa992415f\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 24 
00:28:12 crc kubenswrapper[4824]: I0224 00:28:12.571584 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/2502d667-d99a-44c3-9d90-297fa992415f-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"2502d667-d99a-44c3-9d90-297fa992415f\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 24 00:28:12 crc kubenswrapper[4824]: I0224 00:28:12.571609 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/2502d667-d99a-44c3-9d90-297fa992415f-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"2502d667-d99a-44c3-9d90-297fa992415f\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 24 00:28:12 crc kubenswrapper[4824]: I0224 00:28:12.571631 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/2502d667-d99a-44c3-9d90-297fa992415f-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"2502d667-d99a-44c3-9d90-297fa992415f\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 24 00:28:12 crc kubenswrapper[4824]: I0224 00:28:12.571690 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/2502d667-d99a-44c3-9d90-297fa992415f-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"2502d667-d99a-44c3-9d90-297fa992415f\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 24 00:28:12 crc kubenswrapper[4824]: I0224 00:28:12.571712 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/2502d667-d99a-44c3-9d90-297fa992415f-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"2502d667-d99a-44c3-9d90-297fa992415f\") " 
pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 24 00:28:12 crc kubenswrapper[4824]: I0224 00:28:12.571497 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/2502d667-d99a-44c3-9d90-297fa992415f-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"2502d667-d99a-44c3-9d90-297fa992415f\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 24 00:28:12 crc kubenswrapper[4824]: I0224 00:28:12.571889 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/2502d667-d99a-44c3-9d90-297fa992415f-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"2502d667-d99a-44c3-9d90-297fa992415f\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 24 00:28:12 crc kubenswrapper[4824]: I0224 00:28:12.572061 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/2502d667-d99a-44c3-9d90-297fa992415f-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"2502d667-d99a-44c3-9d90-297fa992415f\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 24 00:28:12 crc kubenswrapper[4824]: I0224 00:28:12.572101 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/2502d667-d99a-44c3-9d90-297fa992415f-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"2502d667-d99a-44c3-9d90-297fa992415f\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 24 00:28:12 crc kubenswrapper[4824]: I0224 00:28:12.572136 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2502d667-d99a-44c3-9d90-297fa992415f-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: 
\"2502d667-d99a-44c3-9d90-297fa992415f\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 24 00:28:12 crc kubenswrapper[4824]: I0224 00:28:12.572151 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2502d667-d99a-44c3-9d90-297fa992415f-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"2502d667-d99a-44c3-9d90-297fa992415f\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 24 00:28:12 crc kubenswrapper[4824]: I0224 00:28:12.572237 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/2502d667-d99a-44c3-9d90-297fa992415f-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"2502d667-d99a-44c3-9d90-297fa992415f\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 24 00:28:12 crc kubenswrapper[4824]: I0224 00:28:12.575919 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/2502d667-d99a-44c3-9d90-297fa992415f-builder-dockercfg-gqxxs-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"2502d667-d99a-44c3-9d90-297fa992415f\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 24 00:28:12 crc kubenswrapper[4824]: I0224 00:28:12.575929 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/2502d667-d99a-44c3-9d90-297fa992415f-builder-dockercfg-gqxxs-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"2502d667-d99a-44c3-9d90-297fa992415f\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 24 00:28:12 crc kubenswrapper[4824]: I0224 00:28:12.598030 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42hgn\" (UniqueName: 
\"kubernetes.io/projected/2502d667-d99a-44c3-9d90-297fa992415f-kube-api-access-42hgn\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"2502d667-d99a-44c3-9d90-297fa992415f\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 24 00:28:12 crc kubenswrapper[4824]: I0224 00:28:12.693342 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 24 00:28:12 crc kubenswrapper[4824]: I0224 00:28:12.702174 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4ba4a5c-0b11-4224-93c1-916afc845dad" path="/var/lib/kubelet/pods/b4ba4a5c-0b11-4224-93c1-916afc845dad/volumes" Feb 24 00:28:12 crc kubenswrapper[4824]: I0224 00:28:12.907399 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"] Feb 24 00:28:13 crc kubenswrapper[4824]: I0224 00:28:13.305110 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"2502d667-d99a-44c3-9d90-297fa992415f","Type":"ContainerStarted","Data":"83459e53dc9e550b588f73c15cde8e7449e5987cd6bdc17ae8e1b6e921e61f0c"} Feb 24 00:28:13 crc kubenswrapper[4824]: I0224 00:28:13.305463 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"2502d667-d99a-44c3-9d90-297fa992415f","Type":"ContainerStarted","Data":"7f43c818a6b6b7415752e417bc61876b49ff2671f72718851273b1acc4e80670"} Feb 24 00:28:14 crc kubenswrapper[4824]: I0224 00:28:14.318631 4824 generic.go:334] "Generic (PLEG): container finished" podID="2502d667-d99a-44c3-9d90-297fa992415f" containerID="83459e53dc9e550b588f73c15cde8e7449e5987cd6bdc17ae8e1b6e921e61f0c" exitCode=0 Feb 24 00:28:14 crc kubenswrapper[4824]: I0224 00:28:14.318710 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" 
event={"ID":"2502d667-d99a-44c3-9d90-297fa992415f","Type":"ContainerDied","Data":"83459e53dc9e550b588f73c15cde8e7449e5987cd6bdc17ae8e1b6e921e61f0c"} Feb 24 00:28:15 crc kubenswrapper[4824]: I0224 00:28:15.327197 4824 generic.go:334] "Generic (PLEG): container finished" podID="2502d667-d99a-44c3-9d90-297fa992415f" containerID="cd7d2bcb10a00e2cffb8c401cf1245e43d358c4602903a675a36eb052d31ade4" exitCode=0 Feb 24 00:28:15 crc kubenswrapper[4824]: I0224 00:28:15.327251 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"2502d667-d99a-44c3-9d90-297fa992415f","Type":"ContainerDied","Data":"cd7d2bcb10a00e2cffb8c401cf1245e43d358c4602903a675a36eb052d31ade4"} Feb 24 00:28:15 crc kubenswrapper[4824]: I0224 00:28:15.363056 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-2-build_2502d667-d99a-44c3-9d90-297fa992415f/manage-dockerfile/0.log" Feb 24 00:28:16 crc kubenswrapper[4824]: I0224 00:28:16.337530 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"2502d667-d99a-44c3-9d90-297fa992415f","Type":"ContainerStarted","Data":"62efab338768a2b789db26b97a99b02cd2e06c98366a896fcb14f9e0c6b5a304"} Feb 24 00:28:23 crc kubenswrapper[4824]: I0224 00:28:23.275940 4824 patch_prober.go:28] interesting pod/machine-config-daemon-vcbgn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 00:28:23 crc kubenswrapper[4824]: I0224 00:28:23.276820 4824 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Feb 24 00:28:53 crc kubenswrapper[4824]: I0224 00:28:53.276118 4824 patch_prober.go:28] interesting pod/machine-config-daemon-vcbgn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 00:28:53 crc kubenswrapper[4824]: I0224 00:28:53.277190 4824 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 00:28:53 crc kubenswrapper[4824]: I0224 00:28:53.277256 4824 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" Feb 24 00:28:53 crc kubenswrapper[4824]: I0224 00:28:53.278231 4824 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8d819df51f5c54106cb947aecba467be6a7835d606611afb3e6526ac4d026f80"} pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 24 00:28:53 crc kubenswrapper[4824]: I0224 00:28:53.278306 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" containerName="machine-config-daemon" containerID="cri-o://8d819df51f5c54106cb947aecba467be6a7835d606611afb3e6526ac4d026f80" gracePeriod=600 Feb 24 00:28:53 crc kubenswrapper[4824]: I0224 00:28:53.636974 4824 generic.go:334] "Generic (PLEG): container finished" podID="939ca085-9383-42e6-b7d6-37f101137273" 
containerID="8d819df51f5c54106cb947aecba467be6a7835d606611afb3e6526ac4d026f80" exitCode=0 Feb 24 00:28:53 crc kubenswrapper[4824]: I0224 00:28:53.637714 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" event={"ID":"939ca085-9383-42e6-b7d6-37f101137273","Type":"ContainerDied","Data":"8d819df51f5c54106cb947aecba467be6a7835d606611afb3e6526ac4d026f80"} Feb 24 00:28:53 crc kubenswrapper[4824]: I0224 00:28:53.637757 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" event={"ID":"939ca085-9383-42e6-b7d6-37f101137273","Type":"ContainerStarted","Data":"1fadecb8108bb82124c951985733cccd456bd51e1965c7fbca274984d7901364"} Feb 24 00:28:53 crc kubenswrapper[4824]: I0224 00:28:53.637779 4824 scope.go:117] "RemoveContainer" containerID="32f31702ad77be87a49a0e3d023914422f1fbe192b728a29ed31dacaa99cc4eb" Feb 24 00:28:53 crc kubenswrapper[4824]: I0224 00:28:53.669040 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-webhook-snmp-2-build" podStartSLOduration=41.669018191 podStartE2EDuration="41.669018191s" podCreationTimestamp="2026-02-24 00:28:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:28:16.363793021 +0000 UTC m=+1360.353417510" watchObservedRunningTime="2026-02-24 00:28:53.669018191 +0000 UTC m=+1397.658642650" Feb 24 00:29:19 crc kubenswrapper[4824]: I0224 00:29:19.834041 4824 generic.go:334] "Generic (PLEG): container finished" podID="2502d667-d99a-44c3-9d90-297fa992415f" containerID="62efab338768a2b789db26b97a99b02cd2e06c98366a896fcb14f9e0c6b5a304" exitCode=0 Feb 24 00:29:19 crc kubenswrapper[4824]: I0224 00:29:19.834274 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" 
event={"ID":"2502d667-d99a-44c3-9d90-297fa992415f","Type":"ContainerDied","Data":"62efab338768a2b789db26b97a99b02cd2e06c98366a896fcb14f9e0c6b5a304"} Feb 24 00:29:21 crc kubenswrapper[4824]: I0224 00:29:21.153683 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 24 00:29:21 crc kubenswrapper[4824]: I0224 00:29:21.195469 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/2502d667-d99a-44c3-9d90-297fa992415f-buildcachedir\") pod \"2502d667-d99a-44c3-9d90-297fa992415f\" (UID: \"2502d667-d99a-44c3-9d90-297fa992415f\") " Feb 24 00:29:21 crc kubenswrapper[4824]: I0224 00:29:21.195613 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/2502d667-d99a-44c3-9d90-297fa992415f-build-blob-cache\") pod \"2502d667-d99a-44c3-9d90-297fa992415f\" (UID: \"2502d667-d99a-44c3-9d90-297fa992415f\") " Feb 24 00:29:21 crc kubenswrapper[4824]: I0224 00:29:21.195675 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42hgn\" (UniqueName: \"kubernetes.io/projected/2502d667-d99a-44c3-9d90-297fa992415f-kube-api-access-42hgn\") pod \"2502d667-d99a-44c3-9d90-297fa992415f\" (UID: \"2502d667-d99a-44c3-9d90-297fa992415f\") " Feb 24 00:29:21 crc kubenswrapper[4824]: I0224 00:29:21.195724 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2502d667-d99a-44c3-9d90-297fa992415f-build-ca-bundles\") pod \"2502d667-d99a-44c3-9d90-297fa992415f\" (UID: \"2502d667-d99a-44c3-9d90-297fa992415f\") " Feb 24 00:29:21 crc kubenswrapper[4824]: I0224 00:29:21.195750 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: 
\"kubernetes.io/empty-dir/2502d667-d99a-44c3-9d90-297fa992415f-container-storage-root\") pod \"2502d667-d99a-44c3-9d90-297fa992415f\" (UID: \"2502d667-d99a-44c3-9d90-297fa992415f\") " Feb 24 00:29:21 crc kubenswrapper[4824]: I0224 00:29:21.195774 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/2502d667-d99a-44c3-9d90-297fa992415f-builder-dockercfg-gqxxs-push\") pod \"2502d667-d99a-44c3-9d90-297fa992415f\" (UID: \"2502d667-d99a-44c3-9d90-297fa992415f\") " Feb 24 00:29:21 crc kubenswrapper[4824]: I0224 00:29:21.195811 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2502d667-d99a-44c3-9d90-297fa992415f-build-proxy-ca-bundles\") pod \"2502d667-d99a-44c3-9d90-297fa992415f\" (UID: \"2502d667-d99a-44c3-9d90-297fa992415f\") " Feb 24 00:29:21 crc kubenswrapper[4824]: I0224 00:29:21.195855 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/2502d667-d99a-44c3-9d90-297fa992415f-container-storage-run\") pod \"2502d667-d99a-44c3-9d90-297fa992415f\" (UID: \"2502d667-d99a-44c3-9d90-297fa992415f\") " Feb 24 00:29:21 crc kubenswrapper[4824]: I0224 00:29:21.195871 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/2502d667-d99a-44c3-9d90-297fa992415f-build-system-configs\") pod \"2502d667-d99a-44c3-9d90-297fa992415f\" (UID: \"2502d667-d99a-44c3-9d90-297fa992415f\") " Feb 24 00:29:21 crc kubenswrapper[4824]: I0224 00:29:21.195892 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/2502d667-d99a-44c3-9d90-297fa992415f-buildworkdir\") pod \"2502d667-d99a-44c3-9d90-297fa992415f\" (UID: 
\"2502d667-d99a-44c3-9d90-297fa992415f\") " Feb 24 00:29:21 crc kubenswrapper[4824]: I0224 00:29:21.195920 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/2502d667-d99a-44c3-9d90-297fa992415f-node-pullsecrets\") pod \"2502d667-d99a-44c3-9d90-297fa992415f\" (UID: \"2502d667-d99a-44c3-9d90-297fa992415f\") " Feb 24 00:29:21 crc kubenswrapper[4824]: I0224 00:29:21.196004 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/2502d667-d99a-44c3-9d90-297fa992415f-builder-dockercfg-gqxxs-pull\") pod \"2502d667-d99a-44c3-9d90-297fa992415f\" (UID: \"2502d667-d99a-44c3-9d90-297fa992415f\") " Feb 24 00:29:21 crc kubenswrapper[4824]: I0224 00:29:21.197335 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2502d667-d99a-44c3-9d90-297fa992415f-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "2502d667-d99a-44c3-9d90-297fa992415f" (UID: "2502d667-d99a-44c3-9d90-297fa992415f"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 00:29:21 crc kubenswrapper[4824]: I0224 00:29:21.197385 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2502d667-d99a-44c3-9d90-297fa992415f-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "2502d667-d99a-44c3-9d90-297fa992415f" (UID: "2502d667-d99a-44c3-9d90-297fa992415f"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 00:29:21 crc kubenswrapper[4824]: I0224 00:29:21.198255 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2502d667-d99a-44c3-9d90-297fa992415f-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "2502d667-d99a-44c3-9d90-297fa992415f" (UID: "2502d667-d99a-44c3-9d90-297fa992415f"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:29:21 crc kubenswrapper[4824]: I0224 00:29:21.198346 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2502d667-d99a-44c3-9d90-297fa992415f-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "2502d667-d99a-44c3-9d90-297fa992415f" (UID: "2502d667-d99a-44c3-9d90-297fa992415f"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:29:21 crc kubenswrapper[4824]: I0224 00:29:21.199231 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2502d667-d99a-44c3-9d90-297fa992415f-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "2502d667-d99a-44c3-9d90-297fa992415f" (UID: "2502d667-d99a-44c3-9d90-297fa992415f"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:29:21 crc kubenswrapper[4824]: I0224 00:29:21.199394 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2502d667-d99a-44c3-9d90-297fa992415f-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "2502d667-d99a-44c3-9d90-297fa992415f" (UID: "2502d667-d99a-44c3-9d90-297fa992415f"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:29:21 crc kubenswrapper[4824]: I0224 00:29:21.202186 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2502d667-d99a-44c3-9d90-297fa992415f-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "2502d667-d99a-44c3-9d90-297fa992415f" (UID: "2502d667-d99a-44c3-9d90-297fa992415f"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:29:21 crc kubenswrapper[4824]: I0224 00:29:21.205767 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2502d667-d99a-44c3-9d90-297fa992415f-builder-dockercfg-gqxxs-pull" (OuterVolumeSpecName: "builder-dockercfg-gqxxs-pull") pod "2502d667-d99a-44c3-9d90-297fa992415f" (UID: "2502d667-d99a-44c3-9d90-297fa992415f"). InnerVolumeSpecName "builder-dockercfg-gqxxs-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:29:21 crc kubenswrapper[4824]: I0224 00:29:21.205786 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2502d667-d99a-44c3-9d90-297fa992415f-builder-dockercfg-gqxxs-push" (OuterVolumeSpecName: "builder-dockercfg-gqxxs-push") pod "2502d667-d99a-44c3-9d90-297fa992415f" (UID: "2502d667-d99a-44c3-9d90-297fa992415f"). InnerVolumeSpecName "builder-dockercfg-gqxxs-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:29:21 crc kubenswrapper[4824]: I0224 00:29:21.206135 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2502d667-d99a-44c3-9d90-297fa992415f-kube-api-access-42hgn" (OuterVolumeSpecName: "kube-api-access-42hgn") pod "2502d667-d99a-44c3-9d90-297fa992415f" (UID: "2502d667-d99a-44c3-9d90-297fa992415f"). InnerVolumeSpecName "kube-api-access-42hgn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:29:21 crc kubenswrapper[4824]: I0224 00:29:21.306161 4824 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/2502d667-d99a-44c3-9d90-297fa992415f-buildcachedir\") on node \"crc\" DevicePath \"\"" Feb 24 00:29:21 crc kubenswrapper[4824]: I0224 00:29:21.306223 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42hgn\" (UniqueName: \"kubernetes.io/projected/2502d667-d99a-44c3-9d90-297fa992415f-kube-api-access-42hgn\") on node \"crc\" DevicePath \"\"" Feb 24 00:29:21 crc kubenswrapper[4824]: I0224 00:29:21.306237 4824 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2502d667-d99a-44c3-9d90-297fa992415f-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 24 00:29:21 crc kubenswrapper[4824]: I0224 00:29:21.306265 4824 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-gqxxs-push\" (UniqueName: \"kubernetes.io/secret/2502d667-d99a-44c3-9d90-297fa992415f-builder-dockercfg-gqxxs-push\") on node \"crc\" DevicePath \"\"" Feb 24 00:29:21 crc kubenswrapper[4824]: I0224 00:29:21.306279 4824 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2502d667-d99a-44c3-9d90-297fa992415f-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 24 00:29:21 crc kubenswrapper[4824]: I0224 00:29:21.306291 4824 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/2502d667-d99a-44c3-9d90-297fa992415f-container-storage-run\") on node \"crc\" DevicePath \"\"" Feb 24 00:29:21 crc kubenswrapper[4824]: I0224 00:29:21.306305 4824 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/2502d667-d99a-44c3-9d90-297fa992415f-build-system-configs\") on node \"crc\" 
DevicePath \"\"" Feb 24 00:29:21 crc kubenswrapper[4824]: I0224 00:29:21.306319 4824 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/2502d667-d99a-44c3-9d90-297fa992415f-buildworkdir\") on node \"crc\" DevicePath \"\"" Feb 24 00:29:21 crc kubenswrapper[4824]: I0224 00:29:21.306349 4824 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/2502d667-d99a-44c3-9d90-297fa992415f-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Feb 24 00:29:21 crc kubenswrapper[4824]: I0224 00:29:21.306362 4824 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-gqxxs-pull\" (UniqueName: \"kubernetes.io/secret/2502d667-d99a-44c3-9d90-297fa992415f-builder-dockercfg-gqxxs-pull\") on node \"crc\" DevicePath \"\"" Feb 24 00:29:21 crc kubenswrapper[4824]: I0224 00:29:21.332904 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2502d667-d99a-44c3-9d90-297fa992415f-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "2502d667-d99a-44c3-9d90-297fa992415f" (UID: "2502d667-d99a-44c3-9d90-297fa992415f"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:29:21 crc kubenswrapper[4824]: I0224 00:29:21.407835 4824 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/2502d667-d99a-44c3-9d90-297fa992415f-build-blob-cache\") on node \"crc\" DevicePath \"\"" Feb 24 00:29:21 crc kubenswrapper[4824]: I0224 00:29:21.853395 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"2502d667-d99a-44c3-9d90-297fa992415f","Type":"ContainerDied","Data":"7f43c818a6b6b7415752e417bc61876b49ff2671f72718851273b1acc4e80670"} Feb 24 00:29:21 crc kubenswrapper[4824]: I0224 00:29:21.854410 4824 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f43c818a6b6b7415752e417bc61876b49ff2671f72718851273b1acc4e80670" Feb 24 00:29:21 crc kubenswrapper[4824]: I0224 00:29:21.853482 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Feb 24 00:29:22 crc kubenswrapper[4824]: I0224 00:29:22.097764 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2502d667-d99a-44c3-9d90-297fa992415f-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "2502d667-d99a-44c3-9d90-297fa992415f" (UID: "2502d667-d99a-44c3-9d90-297fa992415f"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:29:22 crc kubenswrapper[4824]: I0224 00:29:22.119667 4824 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/2502d667-d99a-44c3-9d90-297fa992415f-container-storage-root\") on node \"crc\" DevicePath \"\"" Feb 24 00:29:27 crc kubenswrapper[4824]: I0224 00:29:27.894671 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-755b8777c-j59cx"] Feb 24 00:29:27 crc kubenswrapper[4824]: E0224 00:29:27.895202 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2502d667-d99a-44c3-9d90-297fa992415f" containerName="git-clone" Feb 24 00:29:27 crc kubenswrapper[4824]: I0224 00:29:27.895215 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="2502d667-d99a-44c3-9d90-297fa992415f" containerName="git-clone" Feb 24 00:29:27 crc kubenswrapper[4824]: E0224 00:29:27.895227 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2502d667-d99a-44c3-9d90-297fa992415f" containerName="manage-dockerfile" Feb 24 00:29:27 crc kubenswrapper[4824]: I0224 00:29:27.895233 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="2502d667-d99a-44c3-9d90-297fa992415f" containerName="manage-dockerfile" Feb 24 00:29:27 crc kubenswrapper[4824]: E0224 00:29:27.895246 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2502d667-d99a-44c3-9d90-297fa992415f" containerName="docker-build" Feb 24 00:29:27 crc kubenswrapper[4824]: I0224 00:29:27.895252 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="2502d667-d99a-44c3-9d90-297fa992415f" containerName="docker-build" Feb 24 00:29:27 crc kubenswrapper[4824]: I0224 00:29:27.895378 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="2502d667-d99a-44c3-9d90-297fa992415f" containerName="docker-build" Feb 24 00:29:27 crc kubenswrapper[4824]: I0224 00:29:27.895904 4824 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-755b8777c-j59cx" Feb 24 00:29:27 crc kubenswrapper[4824]: I0224 00:29:27.898412 4824 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-operator-dockercfg-46ncq" Feb 24 00:29:27 crc kubenswrapper[4824]: I0224 00:29:27.954406 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-755b8777c-j59cx"] Feb 24 00:29:28 crc kubenswrapper[4824]: I0224 00:29:28.006548 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58b9k\" (UniqueName: \"kubernetes.io/projected/99d102db-b6a5-428f-acec-1311a225325d-kube-api-access-58b9k\") pod \"smart-gateway-operator-755b8777c-j59cx\" (UID: \"99d102db-b6a5-428f-acec-1311a225325d\") " pod="service-telemetry/smart-gateway-operator-755b8777c-j59cx" Feb 24 00:29:28 crc kubenswrapper[4824]: I0224 00:29:28.006660 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/99d102db-b6a5-428f-acec-1311a225325d-runner\") pod \"smart-gateway-operator-755b8777c-j59cx\" (UID: \"99d102db-b6a5-428f-acec-1311a225325d\") " pod="service-telemetry/smart-gateway-operator-755b8777c-j59cx" Feb 24 00:29:28 crc kubenswrapper[4824]: I0224 00:29:28.107980 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58b9k\" (UniqueName: \"kubernetes.io/projected/99d102db-b6a5-428f-acec-1311a225325d-kube-api-access-58b9k\") pod \"smart-gateway-operator-755b8777c-j59cx\" (UID: \"99d102db-b6a5-428f-acec-1311a225325d\") " pod="service-telemetry/smart-gateway-operator-755b8777c-j59cx" Feb 24 00:29:28 crc kubenswrapper[4824]: I0224 00:29:28.108057 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"runner\" (UniqueName: 
\"kubernetes.io/empty-dir/99d102db-b6a5-428f-acec-1311a225325d-runner\") pod \"smart-gateway-operator-755b8777c-j59cx\" (UID: \"99d102db-b6a5-428f-acec-1311a225325d\") " pod="service-telemetry/smart-gateway-operator-755b8777c-j59cx" Feb 24 00:29:28 crc kubenswrapper[4824]: I0224 00:29:28.108510 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/99d102db-b6a5-428f-acec-1311a225325d-runner\") pod \"smart-gateway-operator-755b8777c-j59cx\" (UID: \"99d102db-b6a5-428f-acec-1311a225325d\") " pod="service-telemetry/smart-gateway-operator-755b8777c-j59cx" Feb 24 00:29:28 crc kubenswrapper[4824]: I0224 00:29:28.134885 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58b9k\" (UniqueName: \"kubernetes.io/projected/99d102db-b6a5-428f-acec-1311a225325d-kube-api-access-58b9k\") pod \"smart-gateway-operator-755b8777c-j59cx\" (UID: \"99d102db-b6a5-428f-acec-1311a225325d\") " pod="service-telemetry/smart-gateway-operator-755b8777c-j59cx" Feb 24 00:29:28 crc kubenswrapper[4824]: I0224 00:29:28.227736 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-755b8777c-j59cx" Feb 24 00:29:28 crc kubenswrapper[4824]: I0224 00:29:28.437987 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-755b8777c-j59cx"] Feb 24 00:29:28 crc kubenswrapper[4824]: I0224 00:29:28.440117 4824 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 24 00:29:28 crc kubenswrapper[4824]: I0224 00:29:28.903220 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-755b8777c-j59cx" event={"ID":"99d102db-b6a5-428f-acec-1311a225325d","Type":"ContainerStarted","Data":"76b31fe4fe3a776a331527f9c94ab0067e4ef9a961f001cd8d79b9f195a4e6d3"} Feb 24 00:29:33 crc kubenswrapper[4824]: I0224 00:29:33.475770 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-7f7c584b79-2rbxz"] Feb 24 00:29:33 crc kubenswrapper[4824]: I0224 00:29:33.484312 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-7f7c584b79-2rbxz" Feb 24 00:29:33 crc kubenswrapper[4824]: I0224 00:29:33.487195 4824 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"service-telemetry-operator-dockercfg-xkzqf" Feb 24 00:29:33 crc kubenswrapper[4824]: I0224 00:29:33.526496 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-7f7c584b79-2rbxz"] Feb 24 00:29:33 crc kubenswrapper[4824]: I0224 00:29:33.607645 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vspqc\" (UniqueName: \"kubernetes.io/projected/3394aaea-7658-498b-aab1-7494fb832c8f-kube-api-access-vspqc\") pod \"service-telemetry-operator-7f7c584b79-2rbxz\" (UID: \"3394aaea-7658-498b-aab1-7494fb832c8f\") " pod="service-telemetry/service-telemetry-operator-7f7c584b79-2rbxz" Feb 24 00:29:33 crc kubenswrapper[4824]: I0224 00:29:33.607771 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/3394aaea-7658-498b-aab1-7494fb832c8f-runner\") pod \"service-telemetry-operator-7f7c584b79-2rbxz\" (UID: \"3394aaea-7658-498b-aab1-7494fb832c8f\") " pod="service-telemetry/service-telemetry-operator-7f7c584b79-2rbxz" Feb 24 00:29:33 crc kubenswrapper[4824]: I0224 00:29:33.709546 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vspqc\" (UniqueName: \"kubernetes.io/projected/3394aaea-7658-498b-aab1-7494fb832c8f-kube-api-access-vspqc\") pod \"service-telemetry-operator-7f7c584b79-2rbxz\" (UID: \"3394aaea-7658-498b-aab1-7494fb832c8f\") " pod="service-telemetry/service-telemetry-operator-7f7c584b79-2rbxz" Feb 24 00:29:33 crc kubenswrapper[4824]: I0224 00:29:33.709609 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"runner\" (UniqueName: 
\"kubernetes.io/empty-dir/3394aaea-7658-498b-aab1-7494fb832c8f-runner\") pod \"service-telemetry-operator-7f7c584b79-2rbxz\" (UID: \"3394aaea-7658-498b-aab1-7494fb832c8f\") " pod="service-telemetry/service-telemetry-operator-7f7c584b79-2rbxz" Feb 24 00:29:33 crc kubenswrapper[4824]: I0224 00:29:33.710017 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/3394aaea-7658-498b-aab1-7494fb832c8f-runner\") pod \"service-telemetry-operator-7f7c584b79-2rbxz\" (UID: \"3394aaea-7658-498b-aab1-7494fb832c8f\") " pod="service-telemetry/service-telemetry-operator-7f7c584b79-2rbxz" Feb 24 00:29:33 crc kubenswrapper[4824]: I0224 00:29:33.733858 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vspqc\" (UniqueName: \"kubernetes.io/projected/3394aaea-7658-498b-aab1-7494fb832c8f-kube-api-access-vspqc\") pod \"service-telemetry-operator-7f7c584b79-2rbxz\" (UID: \"3394aaea-7658-498b-aab1-7494fb832c8f\") " pod="service-telemetry/service-telemetry-operator-7f7c584b79-2rbxz" Feb 24 00:29:33 crc kubenswrapper[4824]: I0224 00:29:33.821709 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-7f7c584b79-2rbxz" Feb 24 00:29:40 crc kubenswrapper[4824]: I0224 00:29:40.195155 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-7f7c584b79-2rbxz"] Feb 24 00:29:42 crc kubenswrapper[4824]: W0224 00:29:42.904042 4824 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3394aaea_7658_498b_aab1_7494fb832c8f.slice/crio-4f29de758e19b070405b2537ae2ab76fb433f996ec903bb1877f9209bf0a929b WatchSource:0}: Error finding container 4f29de758e19b070405b2537ae2ab76fb433f996ec903bb1877f9209bf0a929b: Status 404 returned error can't find the container with id 4f29de758e19b070405b2537ae2ab76fb433f996ec903bb1877f9209bf0a929b Feb 24 00:29:43 crc kubenswrapper[4824]: I0224 00:29:43.040681 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-7f7c584b79-2rbxz" event={"ID":"3394aaea-7658-498b-aab1-7494fb832c8f","Type":"ContainerStarted","Data":"4f29de758e19b070405b2537ae2ab76fb433f996ec903bb1877f9209bf0a929b"} Feb 24 00:29:46 crc kubenswrapper[4824]: E0224 00:29:46.313406 4824 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/infrawatch/smart-gateway-operator:latest" Feb 24 00:29:46 crc kubenswrapper[4824]: E0224 00:29:46.314339 4824 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/infrawatch/smart-gateway-operator:latest,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:WATCH_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.annotations['olm.targetNamespaces'],},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:OPERATOR_NAME,Value:smart-gateway-operator,ValueFrom:nil,},EnvVar{Name:ANSIBLE_GATHERING,Value:explicit,ValueFrom:nil,},EnvVar{Name:ANSIBLE_VERBOSITY_SMARTGATEWAY_SMARTGATEWAY_INFRA_WATCH,Value:4,ValueFrom:nil,},EnvVar{Name:ANSIBLE_DEBUG_LOGS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CORE_SMARTGATEWAY_IMAGE,Value:image-registry.openshift-image-registry.svc:5000/service-telemetry/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BRIDGE_SMARTGATEWAY_IMAGE,Value:image-registry.openshift-image-registry.svc:5000/service-telemetry/sg-bridge:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OAUTH_PROXY_IMAGE,Value:quay.io/openshift/origin-oauth-proxy:latest,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:smart-gateway-operator.v5.0.1771892962,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:runner,ReadOnly:false,MountPath:/tmp/ansible-operator/runner,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-58b9k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop
:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000670000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod smart-gateway-operator-755b8777c-j59cx_service-telemetry(99d102db-b6a5-428f-acec-1311a225325d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 24 00:29:46 crc kubenswrapper[4824]: E0224 00:29:46.315726 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="service-telemetry/smart-gateway-operator-755b8777c-j59cx" podUID="99d102db-b6a5-428f-acec-1311a225325d" Feb 24 00:29:47 crc kubenswrapper[4824]: E0224 00:29:47.088834 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/infrawatch/smart-gateway-operator:latest\\\"\"" pod="service-telemetry/smart-gateway-operator-755b8777c-j59cx" podUID="99d102db-b6a5-428f-acec-1311a225325d" Feb 24 00:29:52 crc kubenswrapper[4824]: I0224 00:29:52.111454 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-7f7c584b79-2rbxz" event={"ID":"3394aaea-7658-498b-aab1-7494fb832c8f","Type":"ContainerStarted","Data":"7a02a539e07725635599d64121b36d64e50cd14340ba35cb917417835c176e3a"} Feb 24 00:29:52 crc kubenswrapper[4824]: I0224 00:29:52.146418 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-7f7c584b79-2rbxz" podStartSLOduration=10.881463113 
podStartE2EDuration="19.146390696s" podCreationTimestamp="2026-02-24 00:29:33 +0000 UTC" firstStartedPulling="2026-02-24 00:29:42.908967233 +0000 UTC m=+1446.898591702" lastFinishedPulling="2026-02-24 00:29:51.173894816 +0000 UTC m=+1455.163519285" observedRunningTime="2026-02-24 00:29:52.129157348 +0000 UTC m=+1456.118781817" watchObservedRunningTime="2026-02-24 00:29:52.146390696 +0000 UTC m=+1456.136015165" Feb 24 00:30:00 crc kubenswrapper[4824]: I0224 00:30:00.156865 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29531550-fz9x9"] Feb 24 00:30:00 crc kubenswrapper[4824]: I0224 00:30:00.158352 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29531550-fz9x9" Feb 24 00:30:00 crc kubenswrapper[4824]: I0224 00:30:00.162325 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 24 00:30:00 crc kubenswrapper[4824]: I0224 00:30:00.164423 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 24 00:30:00 crc kubenswrapper[4824]: I0224 00:30:00.174472 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29531550-fz9x9"] Feb 24 00:30:00 crc kubenswrapper[4824]: I0224 00:30:00.228948 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbwb6\" (UniqueName: \"kubernetes.io/projected/46e264d0-74a1-43c2-8e9e-b66bf6a0ce87-kube-api-access-gbwb6\") pod \"collect-profiles-29531550-fz9x9\" (UID: \"46e264d0-74a1-43c2-8e9e-b66bf6a0ce87\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531550-fz9x9" Feb 24 00:30:00 crc kubenswrapper[4824]: I0224 00:30:00.229025 4824 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/46e264d0-74a1-43c2-8e9e-b66bf6a0ce87-config-volume\") pod \"collect-profiles-29531550-fz9x9\" (UID: \"46e264d0-74a1-43c2-8e9e-b66bf6a0ce87\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531550-fz9x9" Feb 24 00:30:00 crc kubenswrapper[4824]: I0224 00:30:00.229384 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/46e264d0-74a1-43c2-8e9e-b66bf6a0ce87-secret-volume\") pod \"collect-profiles-29531550-fz9x9\" (UID: \"46e264d0-74a1-43c2-8e9e-b66bf6a0ce87\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531550-fz9x9" Feb 24 00:30:00 crc kubenswrapper[4824]: I0224 00:30:00.330772 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/46e264d0-74a1-43c2-8e9e-b66bf6a0ce87-secret-volume\") pod \"collect-profiles-29531550-fz9x9\" (UID: \"46e264d0-74a1-43c2-8e9e-b66bf6a0ce87\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531550-fz9x9" Feb 24 00:30:00 crc kubenswrapper[4824]: I0224 00:30:00.330857 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbwb6\" (UniqueName: \"kubernetes.io/projected/46e264d0-74a1-43c2-8e9e-b66bf6a0ce87-kube-api-access-gbwb6\") pod \"collect-profiles-29531550-fz9x9\" (UID: \"46e264d0-74a1-43c2-8e9e-b66bf6a0ce87\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531550-fz9x9" Feb 24 00:30:00 crc kubenswrapper[4824]: I0224 00:30:00.330903 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/46e264d0-74a1-43c2-8e9e-b66bf6a0ce87-config-volume\") pod \"collect-profiles-29531550-fz9x9\" (UID: \"46e264d0-74a1-43c2-8e9e-b66bf6a0ce87\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29531550-fz9x9" Feb 24 00:30:00 crc kubenswrapper[4824]: I0224 00:30:00.331993 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/46e264d0-74a1-43c2-8e9e-b66bf6a0ce87-config-volume\") pod \"collect-profiles-29531550-fz9x9\" (UID: \"46e264d0-74a1-43c2-8e9e-b66bf6a0ce87\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531550-fz9x9" Feb 24 00:30:00 crc kubenswrapper[4824]: I0224 00:30:00.340493 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/46e264d0-74a1-43c2-8e9e-b66bf6a0ce87-secret-volume\") pod \"collect-profiles-29531550-fz9x9\" (UID: \"46e264d0-74a1-43c2-8e9e-b66bf6a0ce87\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531550-fz9x9" Feb 24 00:30:00 crc kubenswrapper[4824]: I0224 00:30:00.350213 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbwb6\" (UniqueName: \"kubernetes.io/projected/46e264d0-74a1-43c2-8e9e-b66bf6a0ce87-kube-api-access-gbwb6\") pod \"collect-profiles-29531550-fz9x9\" (UID: \"46e264d0-74a1-43c2-8e9e-b66bf6a0ce87\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531550-fz9x9" Feb 24 00:30:00 crc kubenswrapper[4824]: I0224 00:30:00.482121 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29531550-fz9x9" Feb 24 00:30:00 crc kubenswrapper[4824]: I0224 00:30:00.890216 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29531550-fz9x9"] Feb 24 00:30:00 crc kubenswrapper[4824]: W0224 00:30:00.898733 4824 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46e264d0_74a1_43c2_8e9e_b66bf6a0ce87.slice/crio-0c32cd2f0d416c552424fda5d5b2d830dbf1d98715059a7c909099f2008ffb45 WatchSource:0}: Error finding container 0c32cd2f0d416c552424fda5d5b2d830dbf1d98715059a7c909099f2008ffb45: Status 404 returned error can't find the container with id 0c32cd2f0d416c552424fda5d5b2d830dbf1d98715059a7c909099f2008ffb45 Feb 24 00:30:01 crc kubenswrapper[4824]: I0224 00:30:01.177312 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29531550-fz9x9" event={"ID":"46e264d0-74a1-43c2-8e9e-b66bf6a0ce87","Type":"ContainerStarted","Data":"e75a1f2ea58f5e7d7f03dabeedf0d261d871be3d42322aec43c9fe4075fab698"} Feb 24 00:30:01 crc kubenswrapper[4824]: I0224 00:30:01.177813 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29531550-fz9x9" event={"ID":"46e264d0-74a1-43c2-8e9e-b66bf6a0ce87","Type":"ContainerStarted","Data":"0c32cd2f0d416c552424fda5d5b2d830dbf1d98715059a7c909099f2008ffb45"} Feb 24 00:30:02 crc kubenswrapper[4824]: I0224 00:30:02.185416 4824 generic.go:334] "Generic (PLEG): container finished" podID="46e264d0-74a1-43c2-8e9e-b66bf6a0ce87" containerID="e75a1f2ea58f5e7d7f03dabeedf0d261d871be3d42322aec43c9fe4075fab698" exitCode=0 Feb 24 00:30:02 crc kubenswrapper[4824]: I0224 00:30:02.185492 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29531550-fz9x9" 
event={"ID":"46e264d0-74a1-43c2-8e9e-b66bf6a0ce87","Type":"ContainerDied","Data":"e75a1f2ea58f5e7d7f03dabeedf0d261d871be3d42322aec43c9fe4075fab698"} Feb 24 00:30:02 crc kubenswrapper[4824]: I0224 00:30:02.187315 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-755b8777c-j59cx" event={"ID":"99d102db-b6a5-428f-acec-1311a225325d","Type":"ContainerStarted","Data":"26d94a9877cc40e64a79c3e06105e642b8b1ead1d8012838afe9b3fc9455b9d6"} Feb 24 00:30:02 crc kubenswrapper[4824]: I0224 00:30:02.233795 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-755b8777c-j59cx" podStartSLOduration=2.265260526 podStartE2EDuration="35.233773274s" podCreationTimestamp="2026-02-24 00:29:27 +0000 UTC" firstStartedPulling="2026-02-24 00:29:28.439594314 +0000 UTC m=+1432.429218783" lastFinishedPulling="2026-02-24 00:30:01.408107072 +0000 UTC m=+1465.397731531" observedRunningTime="2026-02-24 00:30:02.22840405 +0000 UTC m=+1466.218028529" watchObservedRunningTime="2026-02-24 00:30:02.233773274 +0000 UTC m=+1466.223397743" Feb 24 00:30:03 crc kubenswrapper[4824]: I0224 00:30:03.451535 4824 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29531550-fz9x9" Feb 24 00:30:03 crc kubenswrapper[4824]: I0224 00:30:03.482916 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbwb6\" (UniqueName: \"kubernetes.io/projected/46e264d0-74a1-43c2-8e9e-b66bf6a0ce87-kube-api-access-gbwb6\") pod \"46e264d0-74a1-43c2-8e9e-b66bf6a0ce87\" (UID: \"46e264d0-74a1-43c2-8e9e-b66bf6a0ce87\") " Feb 24 00:30:03 crc kubenswrapper[4824]: I0224 00:30:03.482988 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/46e264d0-74a1-43c2-8e9e-b66bf6a0ce87-config-volume\") pod \"46e264d0-74a1-43c2-8e9e-b66bf6a0ce87\" (UID: \"46e264d0-74a1-43c2-8e9e-b66bf6a0ce87\") " Feb 24 00:30:03 crc kubenswrapper[4824]: I0224 00:30:03.483036 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/46e264d0-74a1-43c2-8e9e-b66bf6a0ce87-secret-volume\") pod \"46e264d0-74a1-43c2-8e9e-b66bf6a0ce87\" (UID: \"46e264d0-74a1-43c2-8e9e-b66bf6a0ce87\") " Feb 24 00:30:03 crc kubenswrapper[4824]: I0224 00:30:03.484548 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46e264d0-74a1-43c2-8e9e-b66bf6a0ce87-config-volume" (OuterVolumeSpecName: "config-volume") pod "46e264d0-74a1-43c2-8e9e-b66bf6a0ce87" (UID: "46e264d0-74a1-43c2-8e9e-b66bf6a0ce87"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:30:03 crc kubenswrapper[4824]: I0224 00:30:03.490720 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46e264d0-74a1-43c2-8e9e-b66bf6a0ce87-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "46e264d0-74a1-43c2-8e9e-b66bf6a0ce87" (UID: "46e264d0-74a1-43c2-8e9e-b66bf6a0ce87"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:30:03 crc kubenswrapper[4824]: I0224 00:30:03.490819 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46e264d0-74a1-43c2-8e9e-b66bf6a0ce87-kube-api-access-gbwb6" (OuterVolumeSpecName: "kube-api-access-gbwb6") pod "46e264d0-74a1-43c2-8e9e-b66bf6a0ce87" (UID: "46e264d0-74a1-43c2-8e9e-b66bf6a0ce87"). InnerVolumeSpecName "kube-api-access-gbwb6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:30:03 crc kubenswrapper[4824]: I0224 00:30:03.584893 4824 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/46e264d0-74a1-43c2-8e9e-b66bf6a0ce87-config-volume\") on node \"crc\" DevicePath \"\"" Feb 24 00:30:03 crc kubenswrapper[4824]: I0224 00:30:03.584968 4824 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/46e264d0-74a1-43c2-8e9e-b66bf6a0ce87-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 24 00:30:03 crc kubenswrapper[4824]: I0224 00:30:03.584980 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbwb6\" (UniqueName: \"kubernetes.io/projected/46e264d0-74a1-43c2-8e9e-b66bf6a0ce87-kube-api-access-gbwb6\") on node \"crc\" DevicePath \"\"" Feb 24 00:30:04 crc kubenswrapper[4824]: I0224 00:30:04.203133 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29531550-fz9x9" event={"ID":"46e264d0-74a1-43c2-8e9e-b66bf6a0ce87","Type":"ContainerDied","Data":"0c32cd2f0d416c552424fda5d5b2d830dbf1d98715059a7c909099f2008ffb45"} Feb 24 00:30:04 crc kubenswrapper[4824]: I0224 00:30:04.203627 4824 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c32cd2f0d416c552424fda5d5b2d830dbf1d98715059a7c909099f2008ffb45" Feb 24 00:30:04 crc kubenswrapper[4824]: I0224 00:30:04.203396 4824 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29531550-fz9x9" Feb 24 00:30:21 crc kubenswrapper[4824]: I0224 00:30:21.227600 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-c5wht"] Feb 24 00:30:21 crc kubenswrapper[4824]: E0224 00:30:21.229016 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46e264d0-74a1-43c2-8e9e-b66bf6a0ce87" containerName="collect-profiles" Feb 24 00:30:21 crc kubenswrapper[4824]: I0224 00:30:21.229037 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="46e264d0-74a1-43c2-8e9e-b66bf6a0ce87" containerName="collect-profiles" Feb 24 00:30:21 crc kubenswrapper[4824]: I0224 00:30:21.229231 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="46e264d0-74a1-43c2-8e9e-b66bf6a0ce87" containerName="collect-profiles" Feb 24 00:30:21 crc kubenswrapper[4824]: I0224 00:30:21.229956 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-c5wht" Feb 24 00:30:21 crc kubenswrapper[4824]: I0224 00:30:21.234992 4824 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-credentials" Feb 24 00:30:21 crc kubenswrapper[4824]: I0224 00:30:21.235854 4824 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-dockercfg-d6d6z" Feb 24 00:30:21 crc kubenswrapper[4824]: I0224 00:30:21.235861 4824 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-ca" Feb 24 00:30:21 crc kubenswrapper[4824]: I0224 00:30:21.236203 4824 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-credentials" Feb 24 00:30:21 crc kubenswrapper[4824]: I0224 00:30:21.236279 4824 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-ca" Feb 24 00:30:21 crc kubenswrapper[4824]: I0224 00:30:21.240393 4824 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-users" Feb 24 00:30:21 crc kubenswrapper[4824]: I0224 00:30:21.242508 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-c5wht"] Feb 24 00:30:21 crc kubenswrapper[4824]: I0224 00:30:21.246293 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-interconnect-sasl-config" Feb 24 00:30:21 crc kubenswrapper[4824]: I0224 00:30:21.265705 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/4ffe4e04-44ea-455a-8788-47d60605ed27-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-c5wht\" (UID: \"4ffe4e04-44ea-455a-8788-47d60605ed27\") " 
pod="service-telemetry/default-interconnect-68864d46cb-c5wht" Feb 24 00:30:21 crc kubenswrapper[4824]: I0224 00:30:21.265959 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/4ffe4e04-44ea-455a-8788-47d60605ed27-sasl-users\") pod \"default-interconnect-68864d46cb-c5wht\" (UID: \"4ffe4e04-44ea-455a-8788-47d60605ed27\") " pod="service-telemetry/default-interconnect-68864d46cb-c5wht" Feb 24 00:30:21 crc kubenswrapper[4824]: I0224 00:30:21.266061 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/4ffe4e04-44ea-455a-8788-47d60605ed27-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-c5wht\" (UID: \"4ffe4e04-44ea-455a-8788-47d60605ed27\") " pod="service-telemetry/default-interconnect-68864d46cb-c5wht" Feb 24 00:30:21 crc kubenswrapper[4824]: I0224 00:30:21.266143 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6lw6\" (UniqueName: \"kubernetes.io/projected/4ffe4e04-44ea-455a-8788-47d60605ed27-kube-api-access-s6lw6\") pod \"default-interconnect-68864d46cb-c5wht\" (UID: \"4ffe4e04-44ea-455a-8788-47d60605ed27\") " pod="service-telemetry/default-interconnect-68864d46cb-c5wht" Feb 24 00:30:21 crc kubenswrapper[4824]: I0224 00:30:21.266218 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/4ffe4e04-44ea-455a-8788-47d60605ed27-sasl-config\") pod \"default-interconnect-68864d46cb-c5wht\" (UID: \"4ffe4e04-44ea-455a-8788-47d60605ed27\") " pod="service-telemetry/default-interconnect-68864d46cb-c5wht" Feb 24 00:30:21 crc kubenswrapper[4824]: I0224 00:30:21.266311 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/4ffe4e04-44ea-455a-8788-47d60605ed27-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-c5wht\" (UID: \"4ffe4e04-44ea-455a-8788-47d60605ed27\") " pod="service-telemetry/default-interconnect-68864d46cb-c5wht" Feb 24 00:30:21 crc kubenswrapper[4824]: I0224 00:30:21.266390 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/4ffe4e04-44ea-455a-8788-47d60605ed27-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-c5wht\" (UID: \"4ffe4e04-44ea-455a-8788-47d60605ed27\") " pod="service-telemetry/default-interconnect-68864d46cb-c5wht" Feb 24 00:30:21 crc kubenswrapper[4824]: I0224 00:30:21.367408 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/4ffe4e04-44ea-455a-8788-47d60605ed27-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-c5wht\" (UID: \"4ffe4e04-44ea-455a-8788-47d60605ed27\") " pod="service-telemetry/default-interconnect-68864d46cb-c5wht" Feb 24 00:30:21 crc kubenswrapper[4824]: I0224 00:30:21.367469 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/4ffe4e04-44ea-455a-8788-47d60605ed27-sasl-users\") pod \"default-interconnect-68864d46cb-c5wht\" (UID: \"4ffe4e04-44ea-455a-8788-47d60605ed27\") " pod="service-telemetry/default-interconnect-68864d46cb-c5wht" Feb 24 00:30:21 crc kubenswrapper[4824]: I0224 00:30:21.367501 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/4ffe4e04-44ea-455a-8788-47d60605ed27-default-interconnect-inter-router-credentials\") pod 
\"default-interconnect-68864d46cb-c5wht\" (UID: \"4ffe4e04-44ea-455a-8788-47d60605ed27\") " pod="service-telemetry/default-interconnect-68864d46cb-c5wht" Feb 24 00:30:21 crc kubenswrapper[4824]: I0224 00:30:21.367541 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6lw6\" (UniqueName: \"kubernetes.io/projected/4ffe4e04-44ea-455a-8788-47d60605ed27-kube-api-access-s6lw6\") pod \"default-interconnect-68864d46cb-c5wht\" (UID: \"4ffe4e04-44ea-455a-8788-47d60605ed27\") " pod="service-telemetry/default-interconnect-68864d46cb-c5wht" Feb 24 00:30:21 crc kubenswrapper[4824]: I0224 00:30:21.367561 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/4ffe4e04-44ea-455a-8788-47d60605ed27-sasl-config\") pod \"default-interconnect-68864d46cb-c5wht\" (UID: \"4ffe4e04-44ea-455a-8788-47d60605ed27\") " pod="service-telemetry/default-interconnect-68864d46cb-c5wht" Feb 24 00:30:21 crc kubenswrapper[4824]: I0224 00:30:21.367588 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/4ffe4e04-44ea-455a-8788-47d60605ed27-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-c5wht\" (UID: \"4ffe4e04-44ea-455a-8788-47d60605ed27\") " pod="service-telemetry/default-interconnect-68864d46cb-c5wht" Feb 24 00:30:21 crc kubenswrapper[4824]: I0224 00:30:21.367608 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/4ffe4e04-44ea-455a-8788-47d60605ed27-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-c5wht\" (UID: \"4ffe4e04-44ea-455a-8788-47d60605ed27\") " pod="service-telemetry/default-interconnect-68864d46cb-c5wht" Feb 24 00:30:21 crc kubenswrapper[4824]: I0224 00:30:21.369183 4824 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/4ffe4e04-44ea-455a-8788-47d60605ed27-sasl-config\") pod \"default-interconnect-68864d46cb-c5wht\" (UID: \"4ffe4e04-44ea-455a-8788-47d60605ed27\") " pod="service-telemetry/default-interconnect-68864d46cb-c5wht" Feb 24 00:30:21 crc kubenswrapper[4824]: I0224 00:30:21.374662 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/4ffe4e04-44ea-455a-8788-47d60605ed27-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-c5wht\" (UID: \"4ffe4e04-44ea-455a-8788-47d60605ed27\") " pod="service-telemetry/default-interconnect-68864d46cb-c5wht" Feb 24 00:30:21 crc kubenswrapper[4824]: I0224 00:30:21.374688 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/4ffe4e04-44ea-455a-8788-47d60605ed27-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-c5wht\" (UID: \"4ffe4e04-44ea-455a-8788-47d60605ed27\") " pod="service-telemetry/default-interconnect-68864d46cb-c5wht" Feb 24 00:30:21 crc kubenswrapper[4824]: I0224 00:30:21.374768 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/4ffe4e04-44ea-455a-8788-47d60605ed27-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-c5wht\" (UID: \"4ffe4e04-44ea-455a-8788-47d60605ed27\") " pod="service-telemetry/default-interconnect-68864d46cb-c5wht" Feb 24 00:30:21 crc kubenswrapper[4824]: I0224 00:30:21.375044 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/4ffe4e04-44ea-455a-8788-47d60605ed27-default-interconnect-openstack-ca\") pod 
\"default-interconnect-68864d46cb-c5wht\" (UID: \"4ffe4e04-44ea-455a-8788-47d60605ed27\") " pod="service-telemetry/default-interconnect-68864d46cb-c5wht" Feb 24 00:30:21 crc kubenswrapper[4824]: I0224 00:30:21.375952 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/4ffe4e04-44ea-455a-8788-47d60605ed27-sasl-users\") pod \"default-interconnect-68864d46cb-c5wht\" (UID: \"4ffe4e04-44ea-455a-8788-47d60605ed27\") " pod="service-telemetry/default-interconnect-68864d46cb-c5wht" Feb 24 00:30:21 crc kubenswrapper[4824]: I0224 00:30:21.388891 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6lw6\" (UniqueName: \"kubernetes.io/projected/4ffe4e04-44ea-455a-8788-47d60605ed27-kube-api-access-s6lw6\") pod \"default-interconnect-68864d46cb-c5wht\" (UID: \"4ffe4e04-44ea-455a-8788-47d60605ed27\") " pod="service-telemetry/default-interconnect-68864d46cb-c5wht" Feb 24 00:30:21 crc kubenswrapper[4824]: I0224 00:30:21.563938 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-c5wht" Feb 24 00:30:21 crc kubenswrapper[4824]: I0224 00:30:21.792354 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-c5wht"] Feb 24 00:30:22 crc kubenswrapper[4824]: I0224 00:30:22.342453 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-c5wht" event={"ID":"4ffe4e04-44ea-455a-8788-47d60605ed27","Type":"ContainerStarted","Data":"53601482595b879499df40be3a545ef10a2727ef3879843a1a926827995e2ff3"} Feb 24 00:30:28 crc kubenswrapper[4824]: I0224 00:30:28.399851 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-c5wht" event={"ID":"4ffe4e04-44ea-455a-8788-47d60605ed27","Type":"ContainerStarted","Data":"29bbfa19ec9b4d4d9603b53deb3db8ea412ca063873b7a0aadd1b1ae2c93f293"} Feb 24 00:30:28 crc kubenswrapper[4824]: I0224 00:30:28.425195 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-interconnect-68864d46cb-c5wht" podStartSLOduration=1.8167469600000001 podStartE2EDuration="7.425174869s" podCreationTimestamp="2026-02-24 00:30:21 +0000 UTC" firstStartedPulling="2026-02-24 00:30:21.802496793 +0000 UTC m=+1485.792121272" lastFinishedPulling="2026-02-24 00:30:27.410924712 +0000 UTC m=+1491.400549181" observedRunningTime="2026-02-24 00:30:28.421148609 +0000 UTC m=+1492.410773158" watchObservedRunningTime="2026-02-24 00:30:28.425174869 +0000 UTC m=+1492.414799338" Feb 24 00:30:32 crc kubenswrapper[4824]: I0224 00:30:32.974708 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-default-0"] Feb 24 00:30:32 crc kubenswrapper[4824]: I0224 00:30:32.977956 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-default-0" Feb 24 00:30:32 crc kubenswrapper[4824]: I0224 00:30:32.981509 4824 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-stf-dockercfg-24dl4" Feb 24 00:30:32 crc kubenswrapper[4824]: I0224 00:30:32.981734 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"serving-certs-ca-bundle" Feb 24 00:30:32 crc kubenswrapper[4824]: I0224 00:30:32.981549 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-0" Feb 24 00:30:32 crc kubenswrapper[4824]: I0224 00:30:32.981549 4824 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default" Feb 24 00:30:32 crc kubenswrapper[4824]: I0224 00:30:32.982058 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-1" Feb 24 00:30:32 crc kubenswrapper[4824]: I0224 00:30:32.982113 4824 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default-web-config" Feb 24 00:30:32 crc kubenswrapper[4824]: I0224 00:30:32.981557 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-2" Feb 24 00:30:32 crc kubenswrapper[4824]: I0224 00:30:32.982244 4824 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-session-secret" Feb 24 00:30:32 crc kubenswrapper[4824]: I0224 00:30:32.981566 4824 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default-tls-assets-0" Feb 24 00:30:32 crc kubenswrapper[4824]: I0224 00:30:32.984700 4824 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-prometheus-proxy-tls" Feb 24 00:30:32 crc kubenswrapper[4824]: I0224 00:30:32.997732 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["service-telemetry/prometheus-default-0"] Feb 24 00:30:33 crc kubenswrapper[4824]: I0224 00:30:33.089912 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d1d48ccf-0bde-4748-8128-1e82ca1f302a-web-config\") pod \"prometheus-default-0\" (UID: \"d1d48ccf-0bde-4748-8128-1e82ca1f302a\") " pod="service-telemetry/prometheus-default-0" Feb 24 00:30:33 crc kubenswrapper[4824]: I0224 00:30:33.090024 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d1d48ccf-0bde-4748-8128-1e82ca1f302a-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"d1d48ccf-0bde-4748-8128-1e82ca1f302a\") " pod="service-telemetry/prometheus-default-0" Feb 24 00:30:33 crc kubenswrapper[4824]: I0224 00:30:33.090049 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/d1d48ccf-0bde-4748-8128-1e82ca1f302a-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"d1d48ccf-0bde-4748-8128-1e82ca1f302a\") " pod="service-telemetry/prometheus-default-0" Feb 24 00:30:33 crc kubenswrapper[4824]: I0224 00:30:33.090085 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d1d48ccf-0bde-4748-8128-1e82ca1f302a-config\") pod \"prometheus-default-0\" (UID: \"d1d48ccf-0bde-4748-8128-1e82ca1f302a\") " pod="service-telemetry/prometheus-default-0" Feb 24 00:30:33 crc kubenswrapper[4824]: I0224 00:30:33.090109 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/d1d48ccf-0bde-4748-8128-1e82ca1f302a-secret-default-prometheus-proxy-tls\") 
pod \"prometheus-default-0\" (UID: \"d1d48ccf-0bde-4748-8128-1e82ca1f302a\") " pod="service-telemetry/prometheus-default-0" Feb 24 00:30:33 crc kubenswrapper[4824]: I0224 00:30:33.090131 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d1d48ccf-0bde-4748-8128-1e82ca1f302a-tls-assets\") pod \"prometheus-default-0\" (UID: \"d1d48ccf-0bde-4748-8128-1e82ca1f302a\") " pod="service-telemetry/prometheus-default-0" Feb 24 00:30:33 crc kubenswrapper[4824]: I0224 00:30:33.090302 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6m9w\" (UniqueName: \"kubernetes.io/projected/d1d48ccf-0bde-4748-8128-1e82ca1f302a-kube-api-access-j6m9w\") pod \"prometheus-default-0\" (UID: \"d1d48ccf-0bde-4748-8128-1e82ca1f302a\") " pod="service-telemetry/prometheus-default-0" Feb 24 00:30:33 crc kubenswrapper[4824]: I0224 00:30:33.090398 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-53e02954-1178-4e89-ac80-46a476b99871\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-53e02954-1178-4e89-ac80-46a476b99871\") pod \"prometheus-default-0\" (UID: \"d1d48ccf-0bde-4748-8128-1e82ca1f302a\") " pod="service-telemetry/prometheus-default-0" Feb 24 00:30:33 crc kubenswrapper[4824]: I0224 00:30:33.090435 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/d1d48ccf-0bde-4748-8128-1e82ca1f302a-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"d1d48ccf-0bde-4748-8128-1e82ca1f302a\") " pod="service-telemetry/prometheus-default-0" Feb 24 00:30:33 crc kubenswrapper[4824]: I0224 00:30:33.090479 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1d48ccf-0bde-4748-8128-1e82ca1f302a-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"d1d48ccf-0bde-4748-8128-1e82ca1f302a\") " pod="service-telemetry/prometheus-default-0" Feb 24 00:30:33 crc kubenswrapper[4824]: I0224 00:30:33.090554 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d1d48ccf-0bde-4748-8128-1e82ca1f302a-config-out\") pod \"prometheus-default-0\" (UID: \"d1d48ccf-0bde-4748-8128-1e82ca1f302a\") " pod="service-telemetry/prometheus-default-0" Feb 24 00:30:33 crc kubenswrapper[4824]: I0224 00:30:33.090578 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/d1d48ccf-0bde-4748-8128-1e82ca1f302a-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"d1d48ccf-0bde-4748-8128-1e82ca1f302a\") " pod="service-telemetry/prometheus-default-0" Feb 24 00:30:33 crc kubenswrapper[4824]: I0224 00:30:33.191589 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d1d48ccf-0bde-4748-8128-1e82ca1f302a-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"d1d48ccf-0bde-4748-8128-1e82ca1f302a\") " pod="service-telemetry/prometheus-default-0" Feb 24 00:30:33 crc kubenswrapper[4824]: I0224 00:30:33.191950 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/d1d48ccf-0bde-4748-8128-1e82ca1f302a-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"d1d48ccf-0bde-4748-8128-1e82ca1f302a\") " pod="service-telemetry/prometheus-default-0" Feb 24 00:30:33 crc kubenswrapper[4824]: I0224 00:30:33.192094 4824 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d1d48ccf-0bde-4748-8128-1e82ca1f302a-config\") pod \"prometheus-default-0\" (UID: \"d1d48ccf-0bde-4748-8128-1e82ca1f302a\") " pod="service-telemetry/prometheus-default-0" Feb 24 00:30:33 crc kubenswrapper[4824]: I0224 00:30:33.192231 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/d1d48ccf-0bde-4748-8128-1e82ca1f302a-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"d1d48ccf-0bde-4748-8128-1e82ca1f302a\") " pod="service-telemetry/prometheus-default-0" Feb 24 00:30:33 crc kubenswrapper[4824]: I0224 00:30:33.192320 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d1d48ccf-0bde-4748-8128-1e82ca1f302a-tls-assets\") pod \"prometheus-default-0\" (UID: \"d1d48ccf-0bde-4748-8128-1e82ca1f302a\") " pod="service-telemetry/prometheus-default-0" Feb 24 00:30:33 crc kubenswrapper[4824]: E0224 00:30:33.192390 4824 secret.go:188] Couldn't get secret service-telemetry/default-prometheus-proxy-tls: secret "default-prometheus-proxy-tls" not found Feb 24 00:30:33 crc kubenswrapper[4824]: E0224 00:30:33.192529 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1d48ccf-0bde-4748-8128-1e82ca1f302a-secret-default-prometheus-proxy-tls podName:d1d48ccf-0bde-4748-8128-1e82ca1f302a nodeName:}" failed. No retries permitted until 2026-02-24 00:30:33.69248547 +0000 UTC m=+1497.682109939 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-default-prometheus-proxy-tls" (UniqueName: "kubernetes.io/secret/d1d48ccf-0bde-4748-8128-1e82ca1f302a-secret-default-prometheus-proxy-tls") pod "prometheus-default-0" (UID: "d1d48ccf-0bde-4748-8128-1e82ca1f302a") : secret "default-prometheus-proxy-tls" not found Feb 24 00:30:33 crc kubenswrapper[4824]: I0224 00:30:33.192917 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/d1d48ccf-0bde-4748-8128-1e82ca1f302a-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"d1d48ccf-0bde-4748-8128-1e82ca1f302a\") " pod="service-telemetry/prometheus-default-0" Feb 24 00:30:33 crc kubenswrapper[4824]: I0224 00:30:33.192963 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d1d48ccf-0bde-4748-8128-1e82ca1f302a-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"d1d48ccf-0bde-4748-8128-1e82ca1f302a\") " pod="service-telemetry/prometheus-default-0" Feb 24 00:30:33 crc kubenswrapper[4824]: I0224 00:30:33.193182 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6m9w\" (UniqueName: \"kubernetes.io/projected/d1d48ccf-0bde-4748-8128-1e82ca1f302a-kube-api-access-j6m9w\") pod \"prometheus-default-0\" (UID: \"d1d48ccf-0bde-4748-8128-1e82ca1f302a\") " pod="service-telemetry/prometheus-default-0" Feb 24 00:30:33 crc kubenswrapper[4824]: I0224 00:30:33.193281 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-53e02954-1178-4e89-ac80-46a476b99871\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-53e02954-1178-4e89-ac80-46a476b99871\") pod \"prometheus-default-0\" (UID: \"d1d48ccf-0bde-4748-8128-1e82ca1f302a\") " pod="service-telemetry/prometheus-default-0" Feb 24 00:30:33 crc kubenswrapper[4824]: I0224 
00:30:33.193385 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/d1d48ccf-0bde-4748-8128-1e82ca1f302a-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"d1d48ccf-0bde-4748-8128-1e82ca1f302a\") " pod="service-telemetry/prometheus-default-0" Feb 24 00:30:33 crc kubenswrapper[4824]: I0224 00:30:33.193998 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1d48ccf-0bde-4748-8128-1e82ca1f302a-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"d1d48ccf-0bde-4748-8128-1e82ca1f302a\") " pod="service-telemetry/prometheus-default-0" Feb 24 00:30:33 crc kubenswrapper[4824]: I0224 00:30:33.194115 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d1d48ccf-0bde-4748-8128-1e82ca1f302a-config-out\") pod \"prometheus-default-0\" (UID: \"d1d48ccf-0bde-4748-8128-1e82ca1f302a\") " pod="service-telemetry/prometheus-default-0" Feb 24 00:30:33 crc kubenswrapper[4824]: I0224 00:30:33.194223 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/d1d48ccf-0bde-4748-8128-1e82ca1f302a-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"d1d48ccf-0bde-4748-8128-1e82ca1f302a\") " pod="service-telemetry/prometheus-default-0" Feb 24 00:30:33 crc kubenswrapper[4824]: I0224 00:30:33.194818 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d1d48ccf-0bde-4748-8128-1e82ca1f302a-web-config\") pod \"prometheus-default-0\" (UID: \"d1d48ccf-0bde-4748-8128-1e82ca1f302a\") " pod="service-telemetry/prometheus-default-0" Feb 24 00:30:33 crc kubenswrapper[4824]: I0224 00:30:33.194758 
4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1d48ccf-0bde-4748-8128-1e82ca1f302a-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"d1d48ccf-0bde-4748-8128-1e82ca1f302a\") " pod="service-telemetry/prometheus-default-0" Feb 24 00:30:33 crc kubenswrapper[4824]: I0224 00:30:33.193952 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/d1d48ccf-0bde-4748-8128-1e82ca1f302a-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"d1d48ccf-0bde-4748-8128-1e82ca1f302a\") " pod="service-telemetry/prometheus-default-0" Feb 24 00:30:33 crc kubenswrapper[4824]: I0224 00:30:33.196367 4824 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 24 00:30:33 crc kubenswrapper[4824]: I0224 00:30:33.196405 4824 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-53e02954-1178-4e89-ac80-46a476b99871\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-53e02954-1178-4e89-ac80-46a476b99871\") pod \"prometheus-default-0\" (UID: \"d1d48ccf-0bde-4748-8128-1e82ca1f302a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a26aa4f9152d04742e9ebd850c2e40da641d10c688b7a26b812dff5f76d22587/globalmount\"" pod="service-telemetry/prometheus-default-0" Feb 24 00:30:33 crc kubenswrapper[4824]: I0224 00:30:33.199301 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d1d48ccf-0bde-4748-8128-1e82ca1f302a-config-out\") pod \"prometheus-default-0\" (UID: \"d1d48ccf-0bde-4748-8128-1e82ca1f302a\") " pod="service-telemetry/prometheus-default-0" Feb 24 00:30:33 crc kubenswrapper[4824]: I0224 00:30:33.199890 4824 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d1d48ccf-0bde-4748-8128-1e82ca1f302a-config\") pod \"prometheus-default-0\" (UID: \"d1d48ccf-0bde-4748-8128-1e82ca1f302a\") " pod="service-telemetry/prometheus-default-0" Feb 24 00:30:33 crc kubenswrapper[4824]: I0224 00:30:33.200475 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/d1d48ccf-0bde-4748-8128-1e82ca1f302a-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"d1d48ccf-0bde-4748-8128-1e82ca1f302a\") " pod="service-telemetry/prometheus-default-0" Feb 24 00:30:33 crc kubenswrapper[4824]: I0224 00:30:33.209092 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d1d48ccf-0bde-4748-8128-1e82ca1f302a-tls-assets\") pod \"prometheus-default-0\" (UID: \"d1d48ccf-0bde-4748-8128-1e82ca1f302a\") " pod="service-telemetry/prometheus-default-0" Feb 24 00:30:33 crc kubenswrapper[4824]: I0224 00:30:33.216124 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d1d48ccf-0bde-4748-8128-1e82ca1f302a-web-config\") pod \"prometheus-default-0\" (UID: \"d1d48ccf-0bde-4748-8128-1e82ca1f302a\") " pod="service-telemetry/prometheus-default-0" Feb 24 00:30:33 crc kubenswrapper[4824]: I0224 00:30:33.216854 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6m9w\" (UniqueName: \"kubernetes.io/projected/d1d48ccf-0bde-4748-8128-1e82ca1f302a-kube-api-access-j6m9w\") pod \"prometheus-default-0\" (UID: \"d1d48ccf-0bde-4748-8128-1e82ca1f302a\") " pod="service-telemetry/prometheus-default-0" Feb 24 00:30:33 crc kubenswrapper[4824]: I0224 00:30:33.233050 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-53e02954-1178-4e89-ac80-46a476b99871\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-53e02954-1178-4e89-ac80-46a476b99871\") pod \"prometheus-default-0\" (UID: \"d1d48ccf-0bde-4748-8128-1e82ca1f302a\") " pod="service-telemetry/prometheus-default-0" Feb 24 00:30:33 crc kubenswrapper[4824]: I0224 00:30:33.702207 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/d1d48ccf-0bde-4748-8128-1e82ca1f302a-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"d1d48ccf-0bde-4748-8128-1e82ca1f302a\") " pod="service-telemetry/prometheus-default-0" Feb 24 00:30:33 crc kubenswrapper[4824]: E0224 00:30:33.702654 4824 secret.go:188] Couldn't get secret service-telemetry/default-prometheus-proxy-tls: secret "default-prometheus-proxy-tls" not found Feb 24 00:30:33 crc kubenswrapper[4824]: E0224 00:30:33.702791 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1d48ccf-0bde-4748-8128-1e82ca1f302a-secret-default-prometheus-proxy-tls podName:d1d48ccf-0bde-4748-8128-1e82ca1f302a nodeName:}" failed. No retries permitted until 2026-02-24 00:30:34.702776527 +0000 UTC m=+1498.692400996 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "secret-default-prometheus-proxy-tls" (UniqueName: "kubernetes.io/secret/d1d48ccf-0bde-4748-8128-1e82ca1f302a-secret-default-prometheus-proxy-tls") pod "prometheus-default-0" (UID: "d1d48ccf-0bde-4748-8128-1e82ca1f302a") : secret "default-prometheus-proxy-tls" not found Feb 24 00:30:34 crc kubenswrapper[4824]: I0224 00:30:34.726313 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/d1d48ccf-0bde-4748-8128-1e82ca1f302a-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"d1d48ccf-0bde-4748-8128-1e82ca1f302a\") " pod="service-telemetry/prometheus-default-0" Feb 24 00:30:34 crc kubenswrapper[4824]: I0224 00:30:34.733868 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/d1d48ccf-0bde-4748-8128-1e82ca1f302a-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"d1d48ccf-0bde-4748-8128-1e82ca1f302a\") " pod="service-telemetry/prometheus-default-0" Feb 24 00:30:34 crc kubenswrapper[4824]: I0224 00:30:34.799141 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-default-0" Feb 24 00:30:35 crc kubenswrapper[4824]: I0224 00:30:35.046758 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-default-0"] Feb 24 00:30:35 crc kubenswrapper[4824]: I0224 00:30:35.452905 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"d1d48ccf-0bde-4748-8128-1e82ca1f302a","Type":"ContainerStarted","Data":"7f379d1da5002e7236ec0fc7c93d073b51bc805a4b867bef73a08afcaef11ced"} Feb 24 00:30:39 crc kubenswrapper[4824]: I0224 00:30:39.490011 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"d1d48ccf-0bde-4748-8128-1e82ca1f302a","Type":"ContainerStarted","Data":"4111657b5dc645e832383c623f1d92292f4ce6c616d9b987861005d1a8191449"} Feb 24 00:30:43 crc kubenswrapper[4824]: I0224 00:30:43.819007 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-snmp-webhook-6856cfb745-zxg6n"] Feb 24 00:30:43 crc kubenswrapper[4824]: I0224 00:30:43.820672 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-snmp-webhook-6856cfb745-zxg6n" Feb 24 00:30:43 crc kubenswrapper[4824]: I0224 00:30:43.846375 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-snmp-webhook-6856cfb745-zxg6n"] Feb 24 00:30:43 crc kubenswrapper[4824]: I0224 00:30:43.973792 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbj4m\" (UniqueName: \"kubernetes.io/projected/13d35d6f-04c4-438a-bda9-ce9c4ed84b99-kube-api-access-hbj4m\") pod \"default-snmp-webhook-6856cfb745-zxg6n\" (UID: \"13d35d6f-04c4-438a-bda9-ce9c4ed84b99\") " pod="service-telemetry/default-snmp-webhook-6856cfb745-zxg6n" Feb 24 00:30:44 crc kubenswrapper[4824]: I0224 00:30:44.074948 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbj4m\" (UniqueName: \"kubernetes.io/projected/13d35d6f-04c4-438a-bda9-ce9c4ed84b99-kube-api-access-hbj4m\") pod \"default-snmp-webhook-6856cfb745-zxg6n\" (UID: \"13d35d6f-04c4-438a-bda9-ce9c4ed84b99\") " pod="service-telemetry/default-snmp-webhook-6856cfb745-zxg6n" Feb 24 00:30:44 crc kubenswrapper[4824]: I0224 00:30:44.102710 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbj4m\" (UniqueName: \"kubernetes.io/projected/13d35d6f-04c4-438a-bda9-ce9c4ed84b99-kube-api-access-hbj4m\") pod \"default-snmp-webhook-6856cfb745-zxg6n\" (UID: \"13d35d6f-04c4-438a-bda9-ce9c4ed84b99\") " pod="service-telemetry/default-snmp-webhook-6856cfb745-zxg6n" Feb 24 00:30:44 crc kubenswrapper[4824]: I0224 00:30:44.143304 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-snmp-webhook-6856cfb745-zxg6n" Feb 24 00:30:44 crc kubenswrapper[4824]: I0224 00:30:44.395905 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-snmp-webhook-6856cfb745-zxg6n"] Feb 24 00:30:44 crc kubenswrapper[4824]: I0224 00:30:44.534548 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-snmp-webhook-6856cfb745-zxg6n" event={"ID":"13d35d6f-04c4-438a-bda9-ce9c4ed84b99","Type":"ContainerStarted","Data":"77c4391bb922438b22712263f7cf09c867bf940b570ddbbc748608267fd5072d"} Feb 24 00:30:47 crc kubenswrapper[4824]: I0224 00:30:47.504153 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/alertmanager-default-0"] Feb 24 00:30:47 crc kubenswrapper[4824]: I0224 00:30:47.507847 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/alertmanager-default-0" Feb 24 00:30:47 crc kubenswrapper[4824]: I0224 00:30:47.514805 4824 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-cluster-tls-config" Feb 24 00:30:47 crc kubenswrapper[4824]: I0224 00:30:47.517710 4824 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-alertmanager-proxy-tls" Feb 24 00:30:47 crc kubenswrapper[4824]: I0224 00:30:47.518035 4824 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-generated" Feb 24 00:30:47 crc kubenswrapper[4824]: I0224 00:30:47.518226 4824 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-stf-dockercfg-vxcg6" Feb 24 00:30:47 crc kubenswrapper[4824]: I0224 00:30:47.518361 4824 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-web-config" Feb 24 00:30:47 crc kubenswrapper[4824]: I0224 00:30:47.518580 4824 reflector.go:368] Caches populated for *v1.Secret from 
object-"service-telemetry"/"alertmanager-default-tls-assets-0" Feb 24 00:30:47 crc kubenswrapper[4824]: I0224 00:30:47.533743 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/alertmanager-default-0"] Feb 24 00:30:47 crc kubenswrapper[4824]: I0224 00:30:47.557962 4824 generic.go:334] "Generic (PLEG): container finished" podID="d1d48ccf-0bde-4748-8128-1e82ca1f302a" containerID="4111657b5dc645e832383c623f1d92292f4ce6c616d9b987861005d1a8191449" exitCode=0 Feb 24 00:30:47 crc kubenswrapper[4824]: I0224 00:30:47.558025 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"d1d48ccf-0bde-4748-8128-1e82ca1f302a","Type":"ContainerDied","Data":"4111657b5dc645e832383c623f1d92292f4ce6c616d9b987861005d1a8191449"} Feb 24 00:30:47 crc kubenswrapper[4824]: I0224 00:30:47.635973 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b4916ffb-2e83-480a-a12f-ad04c6144517-tls-assets\") pod \"alertmanager-default-0\" (UID: \"b4916ffb-2e83-480a-a12f-ad04c6144517\") " pod="service-telemetry/alertmanager-default-0" Feb 24 00:30:47 crc kubenswrapper[4824]: I0224 00:30:47.636064 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b4916ffb-2e83-480a-a12f-ad04c6144517-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"b4916ffb-2e83-480a-a12f-ad04c6144517\") " pod="service-telemetry/alertmanager-default-0" Feb 24 00:30:47 crc kubenswrapper[4824]: I0224 00:30:47.636095 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b4916ffb-2e83-480a-a12f-ad04c6144517-config-volume\") pod \"alertmanager-default-0\" (UID: \"b4916ffb-2e83-480a-a12f-ad04c6144517\") " 
pod="service-telemetry/alertmanager-default-0" Feb 24 00:30:47 crc kubenswrapper[4824]: I0224 00:30:47.636233 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b4916ffb-2e83-480a-a12f-ad04c6144517-config-out\") pod \"alertmanager-default-0\" (UID: \"b4916ffb-2e83-480a-a12f-ad04c6144517\") " pod="service-telemetry/alertmanager-default-0" Feb 24 00:30:47 crc kubenswrapper[4824]: I0224 00:30:47.636439 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4916ffb-2e83-480a-a12f-ad04c6144517-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"b4916ffb-2e83-480a-a12f-ad04c6144517\") " pod="service-telemetry/alertmanager-default-0" Feb 24 00:30:47 crc kubenswrapper[4824]: I0224 00:30:47.636641 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-19750a56-7d5e-4445-b890-e73db1a4c7c1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-19750a56-7d5e-4445-b890-e73db1a4c7c1\") pod \"alertmanager-default-0\" (UID: \"b4916ffb-2e83-480a-a12f-ad04c6144517\") " pod="service-telemetry/alertmanager-default-0" Feb 24 00:30:47 crc kubenswrapper[4824]: I0224 00:30:47.636694 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/b4916ffb-2e83-480a-a12f-ad04c6144517-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"b4916ffb-2e83-480a-a12f-ad04c6144517\") " pod="service-telemetry/alertmanager-default-0" Feb 24 00:30:47 crc kubenswrapper[4824]: I0224 00:30:47.636754 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4s8v\" (UniqueName: 
\"kubernetes.io/projected/b4916ffb-2e83-480a-a12f-ad04c6144517-kube-api-access-k4s8v\") pod \"alertmanager-default-0\" (UID: \"b4916ffb-2e83-480a-a12f-ad04c6144517\") " pod="service-telemetry/alertmanager-default-0" Feb 24 00:30:47 crc kubenswrapper[4824]: I0224 00:30:47.636867 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b4916ffb-2e83-480a-a12f-ad04c6144517-web-config\") pod \"alertmanager-default-0\" (UID: \"b4916ffb-2e83-480a-a12f-ad04c6144517\") " pod="service-telemetry/alertmanager-default-0" Feb 24 00:30:47 crc kubenswrapper[4824]: I0224 00:30:47.740465 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b4916ffb-2e83-480a-a12f-ad04c6144517-tls-assets\") pod \"alertmanager-default-0\" (UID: \"b4916ffb-2e83-480a-a12f-ad04c6144517\") " pod="service-telemetry/alertmanager-default-0" Feb 24 00:30:47 crc kubenswrapper[4824]: I0224 00:30:47.740602 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b4916ffb-2e83-480a-a12f-ad04c6144517-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"b4916ffb-2e83-480a-a12f-ad04c6144517\") " pod="service-telemetry/alertmanager-default-0" Feb 24 00:30:47 crc kubenswrapper[4824]: I0224 00:30:47.740619 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b4916ffb-2e83-480a-a12f-ad04c6144517-config-volume\") pod \"alertmanager-default-0\" (UID: \"b4916ffb-2e83-480a-a12f-ad04c6144517\") " pod="service-telemetry/alertmanager-default-0" Feb 24 00:30:47 crc kubenswrapper[4824]: I0224 00:30:47.740653 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/b4916ffb-2e83-480a-a12f-ad04c6144517-config-out\") pod \"alertmanager-default-0\" (UID: \"b4916ffb-2e83-480a-a12f-ad04c6144517\") " pod="service-telemetry/alertmanager-default-0" Feb 24 00:30:47 crc kubenswrapper[4824]: I0224 00:30:47.740684 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4916ffb-2e83-480a-a12f-ad04c6144517-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"b4916ffb-2e83-480a-a12f-ad04c6144517\") " pod="service-telemetry/alertmanager-default-0" Feb 24 00:30:47 crc kubenswrapper[4824]: I0224 00:30:47.740757 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-19750a56-7d5e-4445-b890-e73db1a4c7c1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-19750a56-7d5e-4445-b890-e73db1a4c7c1\") pod \"alertmanager-default-0\" (UID: \"b4916ffb-2e83-480a-a12f-ad04c6144517\") " pod="service-telemetry/alertmanager-default-0" Feb 24 00:30:47 crc kubenswrapper[4824]: I0224 00:30:47.740787 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/b4916ffb-2e83-480a-a12f-ad04c6144517-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"b4916ffb-2e83-480a-a12f-ad04c6144517\") " pod="service-telemetry/alertmanager-default-0" Feb 24 00:30:47 crc kubenswrapper[4824]: I0224 00:30:47.740805 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4s8v\" (UniqueName: \"kubernetes.io/projected/b4916ffb-2e83-480a-a12f-ad04c6144517-kube-api-access-k4s8v\") pod \"alertmanager-default-0\" (UID: \"b4916ffb-2e83-480a-a12f-ad04c6144517\") " pod="service-telemetry/alertmanager-default-0" Feb 24 00:30:47 crc kubenswrapper[4824]: I0224 00:30:47.740831 4824 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b4916ffb-2e83-480a-a12f-ad04c6144517-web-config\") pod \"alertmanager-default-0\" (UID: \"b4916ffb-2e83-480a-a12f-ad04c6144517\") " pod="service-telemetry/alertmanager-default-0" Feb 24 00:30:47 crc kubenswrapper[4824]: E0224 00:30:47.742827 4824 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found Feb 24 00:30:47 crc kubenswrapper[4824]: E0224 00:30:47.742902 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4916ffb-2e83-480a-a12f-ad04c6144517-secret-default-alertmanager-proxy-tls podName:b4916ffb-2e83-480a-a12f-ad04c6144517 nodeName:}" failed. No retries permitted until 2026-02-24 00:30:48.242878191 +0000 UTC m=+1512.232502660 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/b4916ffb-2e83-480a-a12f-ad04c6144517-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "b4916ffb-2e83-480a-a12f-ad04c6144517") : secret "default-alertmanager-proxy-tls" not found Feb 24 00:30:47 crc kubenswrapper[4824]: I0224 00:30:47.747971 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/b4916ffb-2e83-480a-a12f-ad04c6144517-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"b4916ffb-2e83-480a-a12f-ad04c6144517\") " pod="service-telemetry/alertmanager-default-0" Feb 24 00:30:47 crc kubenswrapper[4824]: I0224 00:30:47.749025 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b4916ffb-2e83-480a-a12f-ad04c6144517-web-config\") pod \"alertmanager-default-0\" (UID: \"b4916ffb-2e83-480a-a12f-ad04c6144517\") " pod="service-telemetry/alertmanager-default-0" Feb 24 00:30:47 crc kubenswrapper[4824]: I0224 
00:30:47.749241 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b4916ffb-2e83-480a-a12f-ad04c6144517-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"b4916ffb-2e83-480a-a12f-ad04c6144517\") " pod="service-telemetry/alertmanager-default-0" Feb 24 00:30:47 crc kubenswrapper[4824]: I0224 00:30:47.749246 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b4916ffb-2e83-480a-a12f-ad04c6144517-config-out\") pod \"alertmanager-default-0\" (UID: \"b4916ffb-2e83-480a-a12f-ad04c6144517\") " pod="service-telemetry/alertmanager-default-0" Feb 24 00:30:47 crc kubenswrapper[4824]: I0224 00:30:47.749362 4824 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 24 00:30:47 crc kubenswrapper[4824]: I0224 00:30:47.749397 4824 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-19750a56-7d5e-4445-b890-e73db1a4c7c1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-19750a56-7d5e-4445-b890-e73db1a4c7c1\") pod \"alertmanager-default-0\" (UID: \"b4916ffb-2e83-480a-a12f-ad04c6144517\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6803be11a56e3609b692f5f0a005852564157744315862e6f0993faa3e20a471/globalmount\"" pod="service-telemetry/alertmanager-default-0" Feb 24 00:30:47 crc kubenswrapper[4824]: I0224 00:30:47.749631 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b4916ffb-2e83-480a-a12f-ad04c6144517-tls-assets\") pod \"alertmanager-default-0\" (UID: \"b4916ffb-2e83-480a-a12f-ad04c6144517\") " pod="service-telemetry/alertmanager-default-0" Feb 24 00:30:47 crc kubenswrapper[4824]: I0224 00:30:47.754550 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b4916ffb-2e83-480a-a12f-ad04c6144517-config-volume\") pod \"alertmanager-default-0\" (UID: \"b4916ffb-2e83-480a-a12f-ad04c6144517\") " pod="service-telemetry/alertmanager-default-0" Feb 24 00:30:47 crc kubenswrapper[4824]: I0224 00:30:47.762724 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4s8v\" (UniqueName: \"kubernetes.io/projected/b4916ffb-2e83-480a-a12f-ad04c6144517-kube-api-access-k4s8v\") pod \"alertmanager-default-0\" (UID: \"b4916ffb-2e83-480a-a12f-ad04c6144517\") " pod="service-telemetry/alertmanager-default-0" Feb 24 00:30:47 crc kubenswrapper[4824]: I0224 00:30:47.781718 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-19750a56-7d5e-4445-b890-e73db1a4c7c1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-19750a56-7d5e-4445-b890-e73db1a4c7c1\") pod \"alertmanager-default-0\" (UID: \"b4916ffb-2e83-480a-a12f-ad04c6144517\") " pod="service-telemetry/alertmanager-default-0" Feb 24 00:30:48 crc kubenswrapper[4824]: I0224 00:30:48.248273 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4916ffb-2e83-480a-a12f-ad04c6144517-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"b4916ffb-2e83-480a-a12f-ad04c6144517\") " pod="service-telemetry/alertmanager-default-0" Feb 24 00:30:48 crc kubenswrapper[4824]: E0224 00:30:48.248482 4824 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found Feb 24 00:30:48 crc kubenswrapper[4824]: E0224 00:30:48.249112 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4916ffb-2e83-480a-a12f-ad04c6144517-secret-default-alertmanager-proxy-tls podName:b4916ffb-2e83-480a-a12f-ad04c6144517 nodeName:}" failed. 
No retries permitted until 2026-02-24 00:30:49.249082267 +0000 UTC m=+1513.238706746 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/b4916ffb-2e83-480a-a12f-ad04c6144517-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "b4916ffb-2e83-480a-a12f-ad04c6144517") : secret "default-alertmanager-proxy-tls" not found Feb 24 00:30:49 crc kubenswrapper[4824]: I0224 00:30:49.268833 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4916ffb-2e83-480a-a12f-ad04c6144517-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"b4916ffb-2e83-480a-a12f-ad04c6144517\") " pod="service-telemetry/alertmanager-default-0" Feb 24 00:30:49 crc kubenswrapper[4824]: E0224 00:30:49.269278 4824 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found Feb 24 00:30:49 crc kubenswrapper[4824]: E0224 00:30:49.269370 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4916ffb-2e83-480a-a12f-ad04c6144517-secret-default-alertmanager-proxy-tls podName:b4916ffb-2e83-480a-a12f-ad04c6144517 nodeName:}" failed. No retries permitted until 2026-02-24 00:30:51.269344443 +0000 UTC m=+1515.258968912 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/b4916ffb-2e83-480a-a12f-ad04c6144517-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "b4916ffb-2e83-480a-a12f-ad04c6144517") : secret "default-alertmanager-proxy-tls" not found Feb 24 00:30:51 crc kubenswrapper[4824]: I0224 00:30:51.314130 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4916ffb-2e83-480a-a12f-ad04c6144517-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"b4916ffb-2e83-480a-a12f-ad04c6144517\") " pod="service-telemetry/alertmanager-default-0" Feb 24 00:30:51 crc kubenswrapper[4824]: I0224 00:30:51.321153 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4916ffb-2e83-480a-a12f-ad04c6144517-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"b4916ffb-2e83-480a-a12f-ad04c6144517\") " pod="service-telemetry/alertmanager-default-0" Feb 24 00:30:51 crc kubenswrapper[4824]: I0224 00:30:51.435853 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/alertmanager-default-0" Feb 24 00:30:52 crc kubenswrapper[4824]: I0224 00:30:52.337944 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/alertmanager-default-0"] Feb 24 00:30:52 crc kubenswrapper[4824]: W0224 00:30:52.513549 4824 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4916ffb_2e83_480a_a12f_ad04c6144517.slice/crio-d55f176c94244b485ddbdd036af985cce55607cb698647513297663dfc60fea0 WatchSource:0}: Error finding container d55f176c94244b485ddbdd036af985cce55607cb698647513297663dfc60fea0: Status 404 returned error can't find the container with id d55f176c94244b485ddbdd036af985cce55607cb698647513297663dfc60fea0 Feb 24 00:30:52 crc kubenswrapper[4824]: I0224 00:30:52.598658 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"b4916ffb-2e83-480a-a12f-ad04c6144517","Type":"ContainerStarted","Data":"d55f176c94244b485ddbdd036af985cce55607cb698647513297663dfc60fea0"} Feb 24 00:30:53 crc kubenswrapper[4824]: I0224 00:30:53.276796 4824 patch_prober.go:28] interesting pod/machine-config-daemon-vcbgn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 00:30:53 crc kubenswrapper[4824]: I0224 00:30:53.277369 4824 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 00:30:53 crc kubenswrapper[4824]: I0224 00:30:53.610310 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/default-snmp-webhook-6856cfb745-zxg6n" event={"ID":"13d35d6f-04c4-438a-bda9-ce9c4ed84b99","Type":"ContainerStarted","Data":"81ee21fad53c64cdb64601823b0c6addb7dd8fe758dbfb38eaa90c6b721ee15b"} Feb 24 00:30:53 crc kubenswrapper[4824]: I0224 00:30:53.638327 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-snmp-webhook-6856cfb745-zxg6n" podStartSLOduration=2.536679472 podStartE2EDuration="10.638292778s" podCreationTimestamp="2026-02-24 00:30:43 +0000 UTC" firstStartedPulling="2026-02-24 00:30:44.402763254 +0000 UTC m=+1508.392387723" lastFinishedPulling="2026-02-24 00:30:52.50437654 +0000 UTC m=+1516.494001029" observedRunningTime="2026-02-24 00:30:53.630272859 +0000 UTC m=+1517.619897328" watchObservedRunningTime="2026-02-24 00:30:53.638292778 +0000 UTC m=+1517.627917257" Feb 24 00:30:55 crc kubenswrapper[4824]: I0224 00:30:55.630791 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"b4916ffb-2e83-480a-a12f-ad04c6144517","Type":"ContainerStarted","Data":"4b9764061f062eec1646f91b5f45f22196cef8acdac4c333d92f1ae0bb6b9c29"} Feb 24 00:30:56 crc kubenswrapper[4824]: I0224 00:30:56.638715 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"d1d48ccf-0bde-4748-8128-1e82ca1f302a","Type":"ContainerStarted","Data":"eefc6044002b003b5b2092d6f6bd05b9c3c8779dc4be438198ee04599e837ded"} Feb 24 00:30:58 crc kubenswrapper[4824]: I0224 00:30:58.656142 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"d1d48ccf-0bde-4748-8128-1e82ca1f302a","Type":"ContainerStarted","Data":"b59498721549b7fd304ac678e60b55112664770494c924a17b71716353906e30"} Feb 24 00:31:01 crc kubenswrapper[4824]: I0224 00:31:01.680955 4824 generic.go:334] "Generic (PLEG): container finished" podID="b4916ffb-2e83-480a-a12f-ad04c6144517" 
containerID="4b9764061f062eec1646f91b5f45f22196cef8acdac4c333d92f1ae0bb6b9c29" exitCode=0 Feb 24 00:31:01 crc kubenswrapper[4824]: I0224 00:31:01.681329 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"b4916ffb-2e83-480a-a12f-ad04c6144517","Type":"ContainerDied","Data":"4b9764061f062eec1646f91b5f45f22196cef8acdac4c333d92f1ae0bb6b9c29"} Feb 24 00:31:02 crc kubenswrapper[4824]: I0224 00:31:02.707749 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-68kfw"] Feb 24 00:31:02 crc kubenswrapper[4824]: I0224 00:31:02.710183 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-68kfw" Feb 24 00:31:02 crc kubenswrapper[4824]: I0224 00:31:02.724385 4824 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-dockercfg-cqjg5" Feb 24 00:31:02 crc kubenswrapper[4824]: I0224 00:31:02.724878 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-coll-meter-sg-core-configmap" Feb 24 00:31:02 crc kubenswrapper[4824]: I0224 00:31:02.724925 4824 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-session-secret" Feb 24 00:31:02 crc kubenswrapper[4824]: I0224 00:31:02.725222 4824 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-coll-meter-proxy-tls" Feb 24 00:31:02 crc kubenswrapper[4824]: I0224 00:31:02.729302 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-68kfw"] Feb 24 00:31:02 crc kubenswrapper[4824]: I0224 00:31:02.817167 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: 
\"kubernetes.io/secret/b1b6fe19-ad2f-490e-80dc-39ed80de85b3-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-68kfw\" (UID: \"b1b6fe19-ad2f-490e-80dc-39ed80de85b3\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-68kfw" Feb 24 00:31:02 crc kubenswrapper[4824]: I0224 00:31:02.817298 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/b1b6fe19-ad2f-490e-80dc-39ed80de85b3-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-68kfw\" (UID: \"b1b6fe19-ad2f-490e-80dc-39ed80de85b3\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-68kfw" Feb 24 00:31:02 crc kubenswrapper[4824]: I0224 00:31:02.817327 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr6rv\" (UniqueName: \"kubernetes.io/projected/b1b6fe19-ad2f-490e-80dc-39ed80de85b3-kube-api-access-mr6rv\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-68kfw\" (UID: \"b1b6fe19-ad2f-490e-80dc-39ed80de85b3\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-68kfw" Feb 24 00:31:02 crc kubenswrapper[4824]: I0224 00:31:02.817423 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/b1b6fe19-ad2f-490e-80dc-39ed80de85b3-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-68kfw\" (UID: \"b1b6fe19-ad2f-490e-80dc-39ed80de85b3\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-68kfw" Feb 24 00:31:02 crc kubenswrapper[4824]: I0224 00:31:02.817796 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/b1b6fe19-ad2f-490e-80dc-39ed80de85b3-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-68kfw\" (UID: \"b1b6fe19-ad2f-490e-80dc-39ed80de85b3\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-68kfw" Feb 24 00:31:02 crc kubenswrapper[4824]: I0224 00:31:02.919292 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/b1b6fe19-ad2f-490e-80dc-39ed80de85b3-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-68kfw\" (UID: \"b1b6fe19-ad2f-490e-80dc-39ed80de85b3\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-68kfw" Feb 24 00:31:02 crc kubenswrapper[4824]: I0224 00:31:02.919616 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/b1b6fe19-ad2f-490e-80dc-39ed80de85b3-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-68kfw\" (UID: \"b1b6fe19-ad2f-490e-80dc-39ed80de85b3\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-68kfw" Feb 24 00:31:02 crc kubenswrapper[4824]: I0224 00:31:02.919651 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/b1b6fe19-ad2f-490e-80dc-39ed80de85b3-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-68kfw\" (UID: \"b1b6fe19-ad2f-490e-80dc-39ed80de85b3\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-68kfw" Feb 24 00:31:02 crc kubenswrapper[4824]: I0224 00:31:02.919668 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mr6rv\" (UniqueName: \"kubernetes.io/projected/b1b6fe19-ad2f-490e-80dc-39ed80de85b3-kube-api-access-mr6rv\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-68kfw\" (UID: 
\"b1b6fe19-ad2f-490e-80dc-39ed80de85b3\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-68kfw" Feb 24 00:31:02 crc kubenswrapper[4824]: I0224 00:31:02.919693 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/b1b6fe19-ad2f-490e-80dc-39ed80de85b3-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-68kfw\" (UID: \"b1b6fe19-ad2f-490e-80dc-39ed80de85b3\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-68kfw" Feb 24 00:31:02 crc kubenswrapper[4824]: E0224 00:31:02.919923 4824 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-coll-meter-proxy-tls: secret "default-cloud1-coll-meter-proxy-tls" not found Feb 24 00:31:02 crc kubenswrapper[4824]: E0224 00:31:02.920051 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1b6fe19-ad2f-490e-80dc-39ed80de85b3-default-cloud1-coll-meter-proxy-tls podName:b1b6fe19-ad2f-490e-80dc-39ed80de85b3 nodeName:}" failed. No retries permitted until 2026-02-24 00:31:03.420020012 +0000 UTC m=+1527.409644481 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "default-cloud1-coll-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/b1b6fe19-ad2f-490e-80dc-39ed80de85b3-default-cloud1-coll-meter-proxy-tls") pod "default-cloud1-coll-meter-smartgateway-7cd87f9766-68kfw" (UID: "b1b6fe19-ad2f-490e-80dc-39ed80de85b3") : secret "default-cloud1-coll-meter-proxy-tls" not found Feb 24 00:31:02 crc kubenswrapper[4824]: I0224 00:31:02.920492 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/b1b6fe19-ad2f-490e-80dc-39ed80de85b3-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-68kfw\" (UID: \"b1b6fe19-ad2f-490e-80dc-39ed80de85b3\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-68kfw" Feb 24 00:31:02 crc kubenswrapper[4824]: I0224 00:31:02.920956 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/b1b6fe19-ad2f-490e-80dc-39ed80de85b3-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-68kfw\" (UID: \"b1b6fe19-ad2f-490e-80dc-39ed80de85b3\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-68kfw" Feb 24 00:31:02 crc kubenswrapper[4824]: I0224 00:31:02.931135 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/b1b6fe19-ad2f-490e-80dc-39ed80de85b3-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-68kfw\" (UID: \"b1b6fe19-ad2f-490e-80dc-39ed80de85b3\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-68kfw" Feb 24 00:31:02 crc kubenswrapper[4824]: I0224 00:31:02.938369 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr6rv\" (UniqueName: \"kubernetes.io/projected/b1b6fe19-ad2f-490e-80dc-39ed80de85b3-kube-api-access-mr6rv\") pod 
\"default-cloud1-coll-meter-smartgateway-7cd87f9766-68kfw\" (UID: \"b1b6fe19-ad2f-490e-80dc-39ed80de85b3\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-68kfw" Feb 24 00:31:03 crc kubenswrapper[4824]: I0224 00:31:03.429023 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/b1b6fe19-ad2f-490e-80dc-39ed80de85b3-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-68kfw\" (UID: \"b1b6fe19-ad2f-490e-80dc-39ed80de85b3\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-68kfw" Feb 24 00:31:03 crc kubenswrapper[4824]: E0224 00:31:03.429233 4824 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-coll-meter-proxy-tls: secret "default-cloud1-coll-meter-proxy-tls" not found Feb 24 00:31:03 crc kubenswrapper[4824]: E0224 00:31:03.429349 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1b6fe19-ad2f-490e-80dc-39ed80de85b3-default-cloud1-coll-meter-proxy-tls podName:b1b6fe19-ad2f-490e-80dc-39ed80de85b3 nodeName:}" failed. No retries permitted until 2026-02-24 00:31:04.429322255 +0000 UTC m=+1528.418946724 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "default-cloud1-coll-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/b1b6fe19-ad2f-490e-80dc-39ed80de85b3-default-cloud1-coll-meter-proxy-tls") pod "default-cloud1-coll-meter-smartgateway-7cd87f9766-68kfw" (UID: "b1b6fe19-ad2f-490e-80dc-39ed80de85b3") : secret "default-cloud1-coll-meter-proxy-tls" not found Feb 24 00:31:04 crc kubenswrapper[4824]: I0224 00:31:04.445872 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/b1b6fe19-ad2f-490e-80dc-39ed80de85b3-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-68kfw\" (UID: \"b1b6fe19-ad2f-490e-80dc-39ed80de85b3\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-68kfw" Feb 24 00:31:04 crc kubenswrapper[4824]: I0224 00:31:04.451255 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/b1b6fe19-ad2f-490e-80dc-39ed80de85b3-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-68kfw\" (UID: \"b1b6fe19-ad2f-490e-80dc-39ed80de85b3\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-68kfw" Feb 24 00:31:04 crc kubenswrapper[4824]: I0224 00:31:04.535991 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-68kfw" Feb 24 00:31:05 crc kubenswrapper[4824]: I0224 00:31:05.146491 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-68kfw"] Feb 24 00:31:05 crc kubenswrapper[4824]: W0224 00:31:05.308693 4824 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1b6fe19_ad2f_490e_80dc_39ed80de85b3.slice/crio-614cf64db079533bc1cd7cc7a4fe56b9e55b2adc095beb655ac8a1fde730548e WatchSource:0}: Error finding container 614cf64db079533bc1cd7cc7a4fe56b9e55b2adc095beb655ac8a1fde730548e: Status 404 returned error can't find the container with id 614cf64db079533bc1cd7cc7a4fe56b9e55b2adc095beb655ac8a1fde730548e Feb 24 00:31:05 crc kubenswrapper[4824]: I0224 00:31:05.716127 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"d1d48ccf-0bde-4748-8128-1e82ca1f302a","Type":"ContainerStarted","Data":"7174819ab5b2a05572d6604bef6b3119a5c5bbd4ceffd358045569d1650a9930"} Feb 24 00:31:05 crc kubenswrapper[4824]: I0224 00:31:05.723622 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-68kfw" event={"ID":"b1b6fe19-ad2f-490e-80dc-39ed80de85b3","Type":"ContainerStarted","Data":"614cf64db079533bc1cd7cc7a4fe56b9e55b2adc095beb655ac8a1fde730548e"} Feb 24 00:31:05 crc kubenswrapper[4824]: I0224 00:31:05.742265 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-default-0" podStartSLOduration=5.049294553 podStartE2EDuration="34.742243414s" podCreationTimestamp="2026-02-24 00:30:31 +0000 UTC" firstStartedPulling="2026-02-24 00:30:35.044645724 +0000 UTC m=+1499.034270183" lastFinishedPulling="2026-02-24 00:31:04.737594575 +0000 UTC m=+1528.727219044" 
observedRunningTime="2026-02-24 00:31:05.741375192 +0000 UTC m=+1529.730999661" watchObservedRunningTime="2026-02-24 00:31:05.742243414 +0000 UTC m=+1529.731867873" Feb 24 00:31:06 crc kubenswrapper[4824]: I0224 00:31:06.133013 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-kmkbw"] Feb 24 00:31:06 crc kubenswrapper[4824]: I0224 00:31:06.135006 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-kmkbw" Feb 24 00:31:06 crc kubenswrapper[4824]: I0224 00:31:06.138392 4824 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-ceil-meter-proxy-tls" Feb 24 00:31:06 crc kubenswrapper[4824]: I0224 00:31:06.138392 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-ceil-meter-sg-core-configmap" Feb 24 00:31:06 crc kubenswrapper[4824]: I0224 00:31:06.145051 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-kmkbw"] Feb 24 00:31:06 crc kubenswrapper[4824]: I0224 00:31:06.274440 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/99b264a5-5103-4445-8978-942c71208377-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-kmkbw\" (UID: \"99b264a5-5103-4445-8978-942c71208377\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-kmkbw" Feb 24 00:31:06 crc kubenswrapper[4824]: I0224 00:31:06.274544 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/99b264a5-5103-4445-8978-942c71208377-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-kmkbw\" (UID: 
\"99b264a5-5103-4445-8978-942c71208377\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-kmkbw" Feb 24 00:31:06 crc kubenswrapper[4824]: I0224 00:31:06.274583 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/99b264a5-5103-4445-8978-942c71208377-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-kmkbw\" (UID: \"99b264a5-5103-4445-8978-942c71208377\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-kmkbw" Feb 24 00:31:06 crc kubenswrapper[4824]: I0224 00:31:06.274607 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/99b264a5-5103-4445-8978-942c71208377-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-kmkbw\" (UID: \"99b264a5-5103-4445-8978-942c71208377\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-kmkbw" Feb 24 00:31:06 crc kubenswrapper[4824]: I0224 00:31:06.274636 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssscv\" (UniqueName: \"kubernetes.io/projected/99b264a5-5103-4445-8978-942c71208377-kube-api-access-ssscv\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-kmkbw\" (UID: \"99b264a5-5103-4445-8978-942c71208377\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-kmkbw" Feb 24 00:31:06 crc kubenswrapper[4824]: I0224 00:31:06.377143 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/99b264a5-5103-4445-8978-942c71208377-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-kmkbw\" (UID: \"99b264a5-5103-4445-8978-942c71208377\") " 
pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-kmkbw" Feb 24 00:31:06 crc kubenswrapper[4824]: I0224 00:31:06.378501 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/99b264a5-5103-4445-8978-942c71208377-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-kmkbw\" (UID: \"99b264a5-5103-4445-8978-942c71208377\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-kmkbw" Feb 24 00:31:06 crc kubenswrapper[4824]: I0224 00:31:06.378571 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/99b264a5-5103-4445-8978-942c71208377-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-kmkbw\" (UID: \"99b264a5-5103-4445-8978-942c71208377\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-kmkbw" Feb 24 00:31:06 crc kubenswrapper[4824]: I0224 00:31:06.378604 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/99b264a5-5103-4445-8978-942c71208377-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-kmkbw\" (UID: \"99b264a5-5103-4445-8978-942c71208377\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-kmkbw" Feb 24 00:31:06 crc kubenswrapper[4824]: I0224 00:31:06.378631 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssscv\" (UniqueName: \"kubernetes.io/projected/99b264a5-5103-4445-8978-942c71208377-kube-api-access-ssscv\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-kmkbw\" (UID: \"99b264a5-5103-4445-8978-942c71208377\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-kmkbw" Feb 24 00:31:06 crc kubenswrapper[4824]: I0224 00:31:06.379443 4824 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/99b264a5-5103-4445-8978-942c71208377-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-kmkbw\" (UID: \"99b264a5-5103-4445-8978-942c71208377\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-kmkbw" Feb 24 00:31:06 crc kubenswrapper[4824]: E0224 00:31:06.379698 4824 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-ceil-meter-proxy-tls: secret "default-cloud1-ceil-meter-proxy-tls" not found Feb 24 00:31:06 crc kubenswrapper[4824]: E0224 00:31:06.380279 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99b264a5-5103-4445-8978-942c71208377-default-cloud1-ceil-meter-proxy-tls podName:99b264a5-5103-4445-8978-942c71208377 nodeName:}" failed. No retries permitted until 2026-02-24 00:31:06.879782512 +0000 UTC m=+1530.869406981 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-cloud1-ceil-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/99b264a5-5103-4445-8978-942c71208377-default-cloud1-ceil-meter-proxy-tls") pod "default-cloud1-ceil-meter-smartgateway-57948895dc-kmkbw" (UID: "99b264a5-5103-4445-8978-942c71208377") : secret "default-cloud1-ceil-meter-proxy-tls" not found Feb 24 00:31:06 crc kubenswrapper[4824]: I0224 00:31:06.380305 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/99b264a5-5103-4445-8978-942c71208377-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-kmkbw\" (UID: \"99b264a5-5103-4445-8978-942c71208377\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-kmkbw" Feb 24 00:31:06 crc kubenswrapper[4824]: I0224 00:31:06.388185 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: 
\"kubernetes.io/secret/99b264a5-5103-4445-8978-942c71208377-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-kmkbw\" (UID: \"99b264a5-5103-4445-8978-942c71208377\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-kmkbw" Feb 24 00:31:06 crc kubenswrapper[4824]: I0224 00:31:06.396589 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssscv\" (UniqueName: \"kubernetes.io/projected/99b264a5-5103-4445-8978-942c71208377-kube-api-access-ssscv\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-kmkbw\" (UID: \"99b264a5-5103-4445-8978-942c71208377\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-kmkbw" Feb 24 00:31:06 crc kubenswrapper[4824]: I0224 00:31:06.733280 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"b4916ffb-2e83-480a-a12f-ad04c6144517","Type":"ContainerStarted","Data":"1cfc2b2dc30bbc7d2befd25ab12a19debefea13d4b2fca65ae59d07f84b29844"} Feb 24 00:31:06 crc kubenswrapper[4824]: I0224 00:31:06.736887 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-68kfw" event={"ID":"b1b6fe19-ad2f-490e-80dc-39ed80de85b3","Type":"ContainerStarted","Data":"3e5b1e8ee30bcdfa03bd25bd368f1687f69c352c8d4634bbd7cd8c28f9bb4e5f"} Feb 24 00:31:06 crc kubenswrapper[4824]: I0224 00:31:06.885959 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/99b264a5-5103-4445-8978-942c71208377-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-kmkbw\" (UID: \"99b264a5-5103-4445-8978-942c71208377\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-kmkbw" Feb 24 00:31:06 crc kubenswrapper[4824]: E0224 00:31:06.886800 4824 secret.go:188] Couldn't get secret 
service-telemetry/default-cloud1-ceil-meter-proxy-tls: secret "default-cloud1-ceil-meter-proxy-tls" not found Feb 24 00:31:06 crc kubenswrapper[4824]: E0224 00:31:06.886879 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99b264a5-5103-4445-8978-942c71208377-default-cloud1-ceil-meter-proxy-tls podName:99b264a5-5103-4445-8978-942c71208377 nodeName:}" failed. No retries permitted until 2026-02-24 00:31:07.886857949 +0000 UTC m=+1531.876482418 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "default-cloud1-ceil-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/99b264a5-5103-4445-8978-942c71208377-default-cloud1-ceil-meter-proxy-tls") pod "default-cloud1-ceil-meter-smartgateway-57948895dc-kmkbw" (UID: "99b264a5-5103-4445-8978-942c71208377") : secret "default-cloud1-ceil-meter-proxy-tls" not found Feb 24 00:31:07 crc kubenswrapper[4824]: I0224 00:31:07.746427 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-68kfw" event={"ID":"b1b6fe19-ad2f-490e-80dc-39ed80de85b3","Type":"ContainerStarted","Data":"4cf792676db082f4deec5a1cc01c8abbce90a283e92e03176c0a5e3f90a5aa8c"} Feb 24 00:31:07 crc kubenswrapper[4824]: I0224 00:31:07.911234 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/99b264a5-5103-4445-8978-942c71208377-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-kmkbw\" (UID: \"99b264a5-5103-4445-8978-942c71208377\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-kmkbw" Feb 24 00:31:07 crc kubenswrapper[4824]: I0224 00:31:07.921776 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/99b264a5-5103-4445-8978-942c71208377-default-cloud1-ceil-meter-proxy-tls\") pod 
\"default-cloud1-ceil-meter-smartgateway-57948895dc-kmkbw\" (UID: \"99b264a5-5103-4445-8978-942c71208377\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-kmkbw" Feb 24 00:31:07 crc kubenswrapper[4824]: I0224 00:31:07.961651 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-kmkbw" Feb 24 00:31:08 crc kubenswrapper[4824]: I0224 00:31:08.545632 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-kmkbw"] Feb 24 00:31:08 crc kubenswrapper[4824]: W0224 00:31:08.555719 4824 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99b264a5_5103_4445_8978_942c71208377.slice/crio-7de16a116602e343519bc1cd05669beb311e65cc5f4b48cc665c5014e2053893 WatchSource:0}: Error finding container 7de16a116602e343519bc1cd05669beb311e65cc5f4b48cc665c5014e2053893: Status 404 returned error can't find the container with id 7de16a116602e343519bc1cd05669beb311e65cc5f4b48cc665c5014e2053893 Feb 24 00:31:08 crc kubenswrapper[4824]: I0224 00:31:08.757061 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"b4916ffb-2e83-480a-a12f-ad04c6144517","Type":"ContainerStarted","Data":"53f1e37d750e2126213d8e20208c38aef9ce0ac22c9b625149e85917d94dbf92"} Feb 24 00:31:08 crc kubenswrapper[4824]: I0224 00:31:08.760501 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-kmkbw" event={"ID":"99b264a5-5103-4445-8978-942c71208377","Type":"ContainerStarted","Data":"7de16a116602e343519bc1cd05669beb311e65cc5f4b48cc665c5014e2053893"} Feb 24 00:31:09 crc kubenswrapper[4824]: I0224 00:31:09.799807 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/prometheus-default-0" Feb 24 
00:31:10 crc kubenswrapper[4824]: I0224 00:31:10.233276 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5jd84"] Feb 24 00:31:10 crc kubenswrapper[4824]: I0224 00:31:10.241011 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5jd84" Feb 24 00:31:10 crc kubenswrapper[4824]: I0224 00:31:10.246088 4824 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-sens-meter-proxy-tls" Feb 24 00:31:10 crc kubenswrapper[4824]: I0224 00:31:10.246221 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-sens-meter-sg-core-configmap" Feb 24 00:31:10 crc kubenswrapper[4824]: I0224 00:31:10.251181 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5jd84"] Feb 24 00:31:10 crc kubenswrapper[4824]: I0224 00:31:10.355843 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/25d3b43f-0bff-44ca-83f4-b8a0052cd764-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-5jd84\" (UID: \"25d3b43f-0bff-44ca-83f4-b8a0052cd764\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5jd84" Feb 24 00:31:10 crc kubenswrapper[4824]: I0224 00:31:10.355915 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/25d3b43f-0bff-44ca-83f4-b8a0052cd764-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-5jd84\" (UID: \"25d3b43f-0bff-44ca-83f4-b8a0052cd764\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5jd84" Feb 24 00:31:10 crc kubenswrapper[4824]: I0224 00:31:10.356084 4824 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/25d3b43f-0bff-44ca-83f4-b8a0052cd764-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-5jd84\" (UID: \"25d3b43f-0bff-44ca-83f4-b8a0052cd764\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5jd84" Feb 24 00:31:10 crc kubenswrapper[4824]: I0224 00:31:10.356248 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gb569\" (UniqueName: \"kubernetes.io/projected/25d3b43f-0bff-44ca-83f4-b8a0052cd764-kube-api-access-gb569\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-5jd84\" (UID: \"25d3b43f-0bff-44ca-83f4-b8a0052cd764\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5jd84" Feb 24 00:31:10 crc kubenswrapper[4824]: I0224 00:31:10.356277 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/25d3b43f-0bff-44ca-83f4-b8a0052cd764-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-5jd84\" (UID: \"25d3b43f-0bff-44ca-83f4-b8a0052cd764\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5jd84" Feb 24 00:31:10 crc kubenswrapper[4824]: I0224 00:31:10.458157 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/25d3b43f-0bff-44ca-83f4-b8a0052cd764-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-5jd84\" (UID: \"25d3b43f-0bff-44ca-83f4-b8a0052cd764\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5jd84" Feb 24 00:31:10 crc kubenswrapper[4824]: I0224 00:31:10.458265 4824 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-gb569\" (UniqueName: \"kubernetes.io/projected/25d3b43f-0bff-44ca-83f4-b8a0052cd764-kube-api-access-gb569\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-5jd84\" (UID: \"25d3b43f-0bff-44ca-83f4-b8a0052cd764\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5jd84" Feb 24 00:31:10 crc kubenswrapper[4824]: I0224 00:31:10.458323 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/25d3b43f-0bff-44ca-83f4-b8a0052cd764-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-5jd84\" (UID: \"25d3b43f-0bff-44ca-83f4-b8a0052cd764\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5jd84" Feb 24 00:31:10 crc kubenswrapper[4824]: I0224 00:31:10.458388 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/25d3b43f-0bff-44ca-83f4-b8a0052cd764-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-5jd84\" (UID: \"25d3b43f-0bff-44ca-83f4-b8a0052cd764\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5jd84" Feb 24 00:31:10 crc kubenswrapper[4824]: I0224 00:31:10.458430 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/25d3b43f-0bff-44ca-83f4-b8a0052cd764-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-5jd84\" (UID: \"25d3b43f-0bff-44ca-83f4-b8a0052cd764\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5jd84" Feb 24 00:31:10 crc kubenswrapper[4824]: E0224 00:31:10.458937 4824 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-sens-meter-proxy-tls: secret "default-cloud1-sens-meter-proxy-tls" not found Feb 24 00:31:10 crc kubenswrapper[4824]: I0224 00:31:10.458980 4824 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/25d3b43f-0bff-44ca-83f4-b8a0052cd764-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-5jd84\" (UID: \"25d3b43f-0bff-44ca-83f4-b8a0052cd764\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5jd84" Feb 24 00:31:10 crc kubenswrapper[4824]: E0224 00:31:10.459035 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25d3b43f-0bff-44ca-83f4-b8a0052cd764-default-cloud1-sens-meter-proxy-tls podName:25d3b43f-0bff-44ca-83f4-b8a0052cd764 nodeName:}" failed. No retries permitted until 2026-02-24 00:31:10.959005991 +0000 UTC m=+1534.948630470 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-cloud1-sens-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/25d3b43f-0bff-44ca-83f4-b8a0052cd764-default-cloud1-sens-meter-proxy-tls") pod "default-cloud1-sens-meter-smartgateway-5759b4d97-5jd84" (UID: "25d3b43f-0bff-44ca-83f4-b8a0052cd764") : secret "default-cloud1-sens-meter-proxy-tls" not found Feb 24 00:31:10 crc kubenswrapper[4824]: I0224 00:31:10.462106 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/25d3b43f-0bff-44ca-83f4-b8a0052cd764-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-5jd84\" (UID: \"25d3b43f-0bff-44ca-83f4-b8a0052cd764\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5jd84" Feb 24 00:31:10 crc kubenswrapper[4824]: I0224 00:31:10.472074 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/25d3b43f-0bff-44ca-83f4-b8a0052cd764-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-5jd84\" (UID: \"25d3b43f-0bff-44ca-83f4-b8a0052cd764\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5jd84" Feb 24 00:31:10 crc 
kubenswrapper[4824]: I0224 00:31:10.480685 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gb569\" (UniqueName: \"kubernetes.io/projected/25d3b43f-0bff-44ca-83f4-b8a0052cd764-kube-api-access-gb569\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-5jd84\" (UID: \"25d3b43f-0bff-44ca-83f4-b8a0052cd764\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5jd84" Feb 24 00:31:10 crc kubenswrapper[4824]: I0224 00:31:10.966098 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/25d3b43f-0bff-44ca-83f4-b8a0052cd764-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-5jd84\" (UID: \"25d3b43f-0bff-44ca-83f4-b8a0052cd764\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5jd84" Feb 24 00:31:10 crc kubenswrapper[4824]: E0224 00:31:10.966328 4824 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-sens-meter-proxy-tls: secret "default-cloud1-sens-meter-proxy-tls" not found Feb 24 00:31:10 crc kubenswrapper[4824]: E0224 00:31:10.966400 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25d3b43f-0bff-44ca-83f4-b8a0052cd764-default-cloud1-sens-meter-proxy-tls podName:25d3b43f-0bff-44ca-83f4-b8a0052cd764 nodeName:}" failed. No retries permitted until 2026-02-24 00:31:11.966379565 +0000 UTC m=+1535.956004034 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "default-cloud1-sens-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/25d3b43f-0bff-44ca-83f4-b8a0052cd764-default-cloud1-sens-meter-proxy-tls") pod "default-cloud1-sens-meter-smartgateway-5759b4d97-5jd84" (UID: "25d3b43f-0bff-44ca-83f4-b8a0052cd764") : secret "default-cloud1-sens-meter-proxy-tls" not found Feb 24 00:31:11 crc kubenswrapper[4824]: I0224 00:31:11.984351 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/25d3b43f-0bff-44ca-83f4-b8a0052cd764-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-5jd84\" (UID: \"25d3b43f-0bff-44ca-83f4-b8a0052cd764\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5jd84" Feb 24 00:31:12 crc kubenswrapper[4824]: I0224 00:31:12.002794 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/25d3b43f-0bff-44ca-83f4-b8a0052cd764-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-5jd84\" (UID: \"25d3b43f-0bff-44ca-83f4-b8a0052cd764\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5jd84" Feb 24 00:31:12 crc kubenswrapper[4824]: I0224 00:31:12.074137 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5jd84" Feb 24 00:31:12 crc kubenswrapper[4824]: I0224 00:31:12.567924 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5jd84"] Feb 24 00:31:12 crc kubenswrapper[4824]: I0224 00:31:12.804601 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-68kfw" event={"ID":"b1b6fe19-ad2f-490e-80dc-39ed80de85b3","Type":"ContainerStarted","Data":"9f0bb9f941494073f43473a4ed1758329780d03e5d150223a5f35671ee6f55e9"} Feb 24 00:31:12 crc kubenswrapper[4824]: I0224 00:31:12.807165 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5jd84" event={"ID":"25d3b43f-0bff-44ca-83f4-b8a0052cd764","Type":"ContainerStarted","Data":"1f54259e5730b34a2560d4dc934811944ebe1241cd69840a98777fa2d1414c52"} Feb 24 00:31:12 crc kubenswrapper[4824]: I0224 00:31:12.809483 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"b4916ffb-2e83-480a-a12f-ad04c6144517","Type":"ContainerStarted","Data":"9bd11c389c536cfcd9cd6ffcc50ee95ec727476440d2570fe4ce41006cf9b49e"} Feb 24 00:31:12 crc kubenswrapper[4824]: I0224 00:31:12.812132 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-kmkbw" event={"ID":"99b264a5-5103-4445-8978-942c71208377","Type":"ContainerStarted","Data":"46e1822d839098966fca9d2fe9fd88f95716adc7230f55dbadcd96a250ec0b01"} Feb 24 00:31:12 crc kubenswrapper[4824]: I0224 00:31:12.812165 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-kmkbw" 
event={"ID":"99b264a5-5103-4445-8978-942c71208377","Type":"ContainerStarted","Data":"cc2f4bc7d66d0e52e1f46fb202f9d99c81667e99fccded51f463888fd413644e"} Feb 24 00:31:12 crc kubenswrapper[4824]: I0224 00:31:12.825377 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-68kfw" podStartSLOduration=4.073564784 podStartE2EDuration="10.825355807s" podCreationTimestamp="2026-02-24 00:31:02 +0000 UTC" firstStartedPulling="2026-02-24 00:31:05.319618814 +0000 UTC m=+1529.309243283" lastFinishedPulling="2026-02-24 00:31:12.071409837 +0000 UTC m=+1536.061034306" observedRunningTime="2026-02-24 00:31:12.823682835 +0000 UTC m=+1536.813307304" watchObservedRunningTime="2026-02-24 00:31:12.825355807 +0000 UTC m=+1536.814980276" Feb 24 00:31:12 crc kubenswrapper[4824]: I0224 00:31:12.868269 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/alertmanager-default-0" podStartSLOduration=16.483218921 podStartE2EDuration="26.868237292s" podCreationTimestamp="2026-02-24 00:30:46 +0000 UTC" firstStartedPulling="2026-02-24 00:31:01.684087498 +0000 UTC m=+1525.673711967" lastFinishedPulling="2026-02-24 00:31:12.069105869 +0000 UTC m=+1536.058730338" observedRunningTime="2026-02-24 00:31:12.867204417 +0000 UTC m=+1536.856828896" watchObservedRunningTime="2026-02-24 00:31:12.868237292 +0000 UTC m=+1536.857861771" Feb 24 00:31:13 crc kubenswrapper[4824]: I0224 00:31:13.826676 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5jd84" event={"ID":"25d3b43f-0bff-44ca-83f4-b8a0052cd764","Type":"ContainerStarted","Data":"0501ae22fa95c619ea2d8a7be3462d1ccb662e4fd126eb9acfa5460631d820d1"} Feb 24 00:31:13 crc kubenswrapper[4824]: I0224 00:31:13.826756 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5jd84" 
event={"ID":"25d3b43f-0bff-44ca-83f4-b8a0052cd764","Type":"ContainerStarted","Data":"a300490310e1c3ebb42c85c5a7fa61913de98e5750b6b6f122bc2e555fca562a"} Feb 24 00:31:13 crc kubenswrapper[4824]: I0224 00:31:13.826774 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5jd84" event={"ID":"25d3b43f-0bff-44ca-83f4-b8a0052cd764","Type":"ContainerStarted","Data":"57ab9185344c6438c10648348978fb546b25c744a42bd3170859d1b119ae7996"} Feb 24 00:31:13 crc kubenswrapper[4824]: I0224 00:31:13.837968 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-kmkbw" event={"ID":"99b264a5-5103-4445-8978-942c71208377","Type":"ContainerStarted","Data":"67b48a5c690377762558d40a05b5c8223d7b5f2dbb2757400994545f20b3efeb"} Feb 24 00:31:13 crc kubenswrapper[4824]: I0224 00:31:13.848797 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5jd84" podStartSLOduration=2.784867101 podStartE2EDuration="3.848772662s" podCreationTimestamp="2026-02-24 00:31:10 +0000 UTC" firstStartedPulling="2026-02-24 00:31:12.59282356 +0000 UTC m=+1536.582448029" lastFinishedPulling="2026-02-24 00:31:13.656729131 +0000 UTC m=+1537.646353590" observedRunningTime="2026-02-24 00:31:13.847304805 +0000 UTC m=+1537.836929284" watchObservedRunningTime="2026-02-24 00:31:13.848772662 +0000 UTC m=+1537.838397131" Feb 24 00:31:18 crc kubenswrapper[4824]: I0224 00:31:18.279193 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-kmkbw" podStartSLOduration=8.081729958 podStartE2EDuration="12.279165874s" podCreationTimestamp="2026-02-24 00:31:06 +0000 UTC" firstStartedPulling="2026-02-24 00:31:08.561412349 +0000 UTC m=+1532.551036818" lastFinishedPulling="2026-02-24 00:31:12.758848265 +0000 UTC 
m=+1536.748472734" observedRunningTime="2026-02-24 00:31:13.879370692 +0000 UTC m=+1537.868995181" watchObservedRunningTime="2026-02-24 00:31:18.279165874 +0000 UTC m=+1542.268790353" Feb 24 00:31:18 crc kubenswrapper[4824]: I0224 00:31:18.286007 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-7bcfc4c8bf-2d6bt"] Feb 24 00:31:18 crc kubenswrapper[4824]: I0224 00:31:18.287628 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7bcfc4c8bf-2d6bt" Feb 24 00:31:18 crc kubenswrapper[4824]: I0224 00:31:18.290856 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-coll-event-sg-core-configmap" Feb 24 00:31:18 crc kubenswrapper[4824]: I0224 00:31:18.305023 4824 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-cert" Feb 24 00:31:18 crc kubenswrapper[4824]: I0224 00:31:18.306894 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-7bcfc4c8bf-2d6bt"] Feb 24 00:31:18 crc kubenswrapper[4824]: I0224 00:31:18.408001 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/77c39bbc-adcc-40f9-afe2-9d97f93262b9-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-7bcfc4c8bf-2d6bt\" (UID: \"77c39bbc-adcc-40f9-afe2-9d97f93262b9\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7bcfc4c8bf-2d6bt" Feb 24 00:31:18 crc kubenswrapper[4824]: I0224 00:31:18.408399 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqxlg\" (UniqueName: \"kubernetes.io/projected/77c39bbc-adcc-40f9-afe2-9d97f93262b9-kube-api-access-fqxlg\") pod \"default-cloud1-coll-event-smartgateway-7bcfc4c8bf-2d6bt\" (UID: 
\"77c39bbc-adcc-40f9-afe2-9d97f93262b9\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7bcfc4c8bf-2d6bt" Feb 24 00:31:18 crc kubenswrapper[4824]: I0224 00:31:18.408620 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/77c39bbc-adcc-40f9-afe2-9d97f93262b9-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-7bcfc4c8bf-2d6bt\" (UID: \"77c39bbc-adcc-40f9-afe2-9d97f93262b9\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7bcfc4c8bf-2d6bt" Feb 24 00:31:18 crc kubenswrapper[4824]: I0224 00:31:18.408936 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/77c39bbc-adcc-40f9-afe2-9d97f93262b9-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-7bcfc4c8bf-2d6bt\" (UID: \"77c39bbc-adcc-40f9-afe2-9d97f93262b9\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7bcfc4c8bf-2d6bt" Feb 24 00:31:18 crc kubenswrapper[4824]: I0224 00:31:18.510642 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/77c39bbc-adcc-40f9-afe2-9d97f93262b9-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-7bcfc4c8bf-2d6bt\" (UID: \"77c39bbc-adcc-40f9-afe2-9d97f93262b9\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7bcfc4c8bf-2d6bt" Feb 24 00:31:18 crc kubenswrapper[4824]: I0224 00:31:18.510713 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqxlg\" (UniqueName: \"kubernetes.io/projected/77c39bbc-adcc-40f9-afe2-9d97f93262b9-kube-api-access-fqxlg\") pod \"default-cloud1-coll-event-smartgateway-7bcfc4c8bf-2d6bt\" (UID: \"77c39bbc-adcc-40f9-afe2-9d97f93262b9\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7bcfc4c8bf-2d6bt" Feb 24 00:31:18 crc 
kubenswrapper[4824]: I0224 00:31:18.510762 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/77c39bbc-adcc-40f9-afe2-9d97f93262b9-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-7bcfc4c8bf-2d6bt\" (UID: \"77c39bbc-adcc-40f9-afe2-9d97f93262b9\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7bcfc4c8bf-2d6bt" Feb 24 00:31:18 crc kubenswrapper[4824]: I0224 00:31:18.510834 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/77c39bbc-adcc-40f9-afe2-9d97f93262b9-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-7bcfc4c8bf-2d6bt\" (UID: \"77c39bbc-adcc-40f9-afe2-9d97f93262b9\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7bcfc4c8bf-2d6bt" Feb 24 00:31:18 crc kubenswrapper[4824]: I0224 00:31:18.511761 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/77c39bbc-adcc-40f9-afe2-9d97f93262b9-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-7bcfc4c8bf-2d6bt\" (UID: \"77c39bbc-adcc-40f9-afe2-9d97f93262b9\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7bcfc4c8bf-2d6bt" Feb 24 00:31:18 crc kubenswrapper[4824]: I0224 00:31:18.512257 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/77c39bbc-adcc-40f9-afe2-9d97f93262b9-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-7bcfc4c8bf-2d6bt\" (UID: \"77c39bbc-adcc-40f9-afe2-9d97f93262b9\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7bcfc4c8bf-2d6bt" Feb 24 00:31:18 crc kubenswrapper[4824]: I0224 00:31:18.524690 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/77c39bbc-adcc-40f9-afe2-9d97f93262b9-elastic-certs\") pod 
\"default-cloud1-coll-event-smartgateway-7bcfc4c8bf-2d6bt\" (UID: \"77c39bbc-adcc-40f9-afe2-9d97f93262b9\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7bcfc4c8bf-2d6bt" Feb 24 00:31:18 crc kubenswrapper[4824]: I0224 00:31:18.530161 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqxlg\" (UniqueName: \"kubernetes.io/projected/77c39bbc-adcc-40f9-afe2-9d97f93262b9-kube-api-access-fqxlg\") pod \"default-cloud1-coll-event-smartgateway-7bcfc4c8bf-2d6bt\" (UID: \"77c39bbc-adcc-40f9-afe2-9d97f93262b9\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7bcfc4c8bf-2d6bt" Feb 24 00:31:18 crc kubenswrapper[4824]: I0224 00:31:18.623840 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7bcfc4c8bf-2d6bt" Feb 24 00:31:19 crc kubenswrapper[4824]: I0224 00:31:19.105535 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-7bcfc4c8bf-2d6bt"] Feb 24 00:31:19 crc kubenswrapper[4824]: W0224 00:31:19.108839 4824 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77c39bbc_adcc_40f9_afe2_9d97f93262b9.slice/crio-e34c44e7de1ec67024a6591049eeba99e994bf04ba4730c05dd2dfad7d208d96 WatchSource:0}: Error finding container e34c44e7de1ec67024a6591049eeba99e994bf04ba4730c05dd2dfad7d208d96: Status 404 returned error can't find the container with id e34c44e7de1ec67024a6591049eeba99e994bf04ba4730c05dd2dfad7d208d96 Feb 24 00:31:19 crc kubenswrapper[4824]: I0224 00:31:19.799835 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/prometheus-default-0" Feb 24 00:31:19 crc kubenswrapper[4824]: I0224 00:31:19.848483 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/prometheus-default-0" Feb 24 00:31:19 crc 
kubenswrapper[4824]: I0224 00:31:19.860613 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-6cd878c86f-p97zc"] Feb 24 00:31:19 crc kubenswrapper[4824]: I0224 00:31:19.869778 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cd878c86f-p97zc" Feb 24 00:31:19 crc kubenswrapper[4824]: I0224 00:31:19.883465 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-6cd878c86f-p97zc"] Feb 24 00:31:19 crc kubenswrapper[4824]: I0224 00:31:19.884714 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-ceil-event-sg-core-configmap" Feb 24 00:31:19 crc kubenswrapper[4824]: I0224 00:31:19.925330 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7bcfc4c8bf-2d6bt" event={"ID":"77c39bbc-adcc-40f9-afe2-9d97f93262b9","Type":"ContainerStarted","Data":"e34c44e7de1ec67024a6591049eeba99e994bf04ba4730c05dd2dfad7d208d96"} Feb 24 00:31:19 crc kubenswrapper[4824]: I0224 00:31:19.937819 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/19cb1d3e-5363-406a-a5f4-ecfe04edd347-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-6cd878c86f-p97zc\" (UID: \"19cb1d3e-5363-406a-a5f4-ecfe04edd347\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cd878c86f-p97zc" Feb 24 00:31:19 crc kubenswrapper[4824]: I0224 00:31:19.937969 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/19cb1d3e-5363-406a-a5f4-ecfe04edd347-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-6cd878c86f-p97zc\" (UID: \"19cb1d3e-5363-406a-a5f4-ecfe04edd347\") " 
pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cd878c86f-p97zc" Feb 24 00:31:19 crc kubenswrapper[4824]: I0224 00:31:19.938121 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxwk9\" (UniqueName: \"kubernetes.io/projected/19cb1d3e-5363-406a-a5f4-ecfe04edd347-kube-api-access-gxwk9\") pod \"default-cloud1-ceil-event-smartgateway-6cd878c86f-p97zc\" (UID: \"19cb1d3e-5363-406a-a5f4-ecfe04edd347\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cd878c86f-p97zc" Feb 24 00:31:19 crc kubenswrapper[4824]: I0224 00:31:19.938175 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/19cb1d3e-5363-406a-a5f4-ecfe04edd347-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-6cd878c86f-p97zc\" (UID: \"19cb1d3e-5363-406a-a5f4-ecfe04edd347\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cd878c86f-p97zc" Feb 24 00:31:19 crc kubenswrapper[4824]: I0224 00:31:19.974997 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/prometheus-default-0" Feb 24 00:31:20 crc kubenswrapper[4824]: I0224 00:31:20.039820 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/19cb1d3e-5363-406a-a5f4-ecfe04edd347-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-6cd878c86f-p97zc\" (UID: \"19cb1d3e-5363-406a-a5f4-ecfe04edd347\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cd878c86f-p97zc" Feb 24 00:31:20 crc kubenswrapper[4824]: I0224 00:31:20.041005 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxwk9\" (UniqueName: \"kubernetes.io/projected/19cb1d3e-5363-406a-a5f4-ecfe04edd347-kube-api-access-gxwk9\") pod \"default-cloud1-ceil-event-smartgateway-6cd878c86f-p97zc\" 
(UID: \"19cb1d3e-5363-406a-a5f4-ecfe04edd347\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cd878c86f-p97zc" Feb 24 00:31:20 crc kubenswrapper[4824]: I0224 00:31:20.041087 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/19cb1d3e-5363-406a-a5f4-ecfe04edd347-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-6cd878c86f-p97zc\" (UID: \"19cb1d3e-5363-406a-a5f4-ecfe04edd347\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cd878c86f-p97zc" Feb 24 00:31:20 crc kubenswrapper[4824]: I0224 00:31:20.041237 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/19cb1d3e-5363-406a-a5f4-ecfe04edd347-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-6cd878c86f-p97zc\" (UID: \"19cb1d3e-5363-406a-a5f4-ecfe04edd347\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cd878c86f-p97zc" Feb 24 00:31:20 crc kubenswrapper[4824]: I0224 00:31:20.042072 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/19cb1d3e-5363-406a-a5f4-ecfe04edd347-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-6cd878c86f-p97zc\" (UID: \"19cb1d3e-5363-406a-a5f4-ecfe04edd347\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cd878c86f-p97zc" Feb 24 00:31:20 crc kubenswrapper[4824]: I0224 00:31:20.042241 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/19cb1d3e-5363-406a-a5f4-ecfe04edd347-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-6cd878c86f-p97zc\" (UID: \"19cb1d3e-5363-406a-a5f4-ecfe04edd347\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cd878c86f-p97zc" Feb 24 00:31:20 crc kubenswrapper[4824]: I0224 00:31:20.059992 4824 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/19cb1d3e-5363-406a-a5f4-ecfe04edd347-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-6cd878c86f-p97zc\" (UID: \"19cb1d3e-5363-406a-a5f4-ecfe04edd347\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cd878c86f-p97zc" Feb 24 00:31:20 crc kubenswrapper[4824]: I0224 00:31:20.070991 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxwk9\" (UniqueName: \"kubernetes.io/projected/19cb1d3e-5363-406a-a5f4-ecfe04edd347-kube-api-access-gxwk9\") pod \"default-cloud1-ceil-event-smartgateway-6cd878c86f-p97zc\" (UID: \"19cb1d3e-5363-406a-a5f4-ecfe04edd347\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cd878c86f-p97zc" Feb 24 00:31:20 crc kubenswrapper[4824]: I0224 00:31:20.257389 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cd878c86f-p97zc" Feb 24 00:31:20 crc kubenswrapper[4824]: I0224 00:31:20.615123 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-6cd878c86f-p97zc"] Feb 24 00:31:20 crc kubenswrapper[4824]: W0224 00:31:20.630390 4824 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19cb1d3e_5363_406a_a5f4_ecfe04edd347.slice/crio-9ff67cab7f3120f0523b4e0ad4234822c8511472e1112e376dcc9c742e509cd6 WatchSource:0}: Error finding container 9ff67cab7f3120f0523b4e0ad4234822c8511472e1112e376dcc9c742e509cd6: Status 404 returned error can't find the container with id 9ff67cab7f3120f0523b4e0ad4234822c8511472e1112e376dcc9c742e509cd6 Feb 24 00:31:20 crc kubenswrapper[4824]: I0224 00:31:20.933010 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cd878c86f-p97zc" 
event={"ID":"19cb1d3e-5363-406a-a5f4-ecfe04edd347","Type":"ContainerStarted","Data":"5d52767b4ff2eefa3a301cf99e13e98ce67da689c9800b168e08d2bdb25f9f50"} Feb 24 00:31:20 crc kubenswrapper[4824]: I0224 00:31:20.933528 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cd878c86f-p97zc" event={"ID":"19cb1d3e-5363-406a-a5f4-ecfe04edd347","Type":"ContainerStarted","Data":"9ff67cab7f3120f0523b4e0ad4234822c8511472e1112e376dcc9c742e509cd6"} Feb 24 00:31:20 crc kubenswrapper[4824]: I0224 00:31:20.936571 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7bcfc4c8bf-2d6bt" event={"ID":"77c39bbc-adcc-40f9-afe2-9d97f93262b9","Type":"ContainerStarted","Data":"be8a0df6436c28c142600bd78cd1d720ab2f7272de54afd3e686be824d339778"} Feb 24 00:31:20 crc kubenswrapper[4824]: I0224 00:31:20.936661 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7bcfc4c8bf-2d6bt" event={"ID":"77c39bbc-adcc-40f9-afe2-9d97f93262b9","Type":"ContainerStarted","Data":"c4a8fb8018686ea387d812e3112b01226a43b249311b6283fbdc21bb7472950f"} Feb 24 00:31:20 crc kubenswrapper[4824]: I0224 00:31:20.967015 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7bcfc4c8bf-2d6bt" podStartSLOduration=1.883486349 podStartE2EDuration="2.966975196s" podCreationTimestamp="2026-02-24 00:31:18 +0000 UTC" firstStartedPulling="2026-02-24 00:31:19.112056395 +0000 UTC m=+1543.101680864" lastFinishedPulling="2026-02-24 00:31:20.195545242 +0000 UTC m=+1544.185169711" observedRunningTime="2026-02-24 00:31:20.961178032 +0000 UTC m=+1544.950802521" watchObservedRunningTime="2026-02-24 00:31:20.966975196 +0000 UTC m=+1544.956599675" Feb 24 00:31:21 crc kubenswrapper[4824]: I0224 00:31:21.950111 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cd878c86f-p97zc" event={"ID":"19cb1d3e-5363-406a-a5f4-ecfe04edd347","Type":"ContainerStarted","Data":"80ace817a383f22df03d441b3f0d005d9ec376e63a99860d577018e17a1d784e"} Feb 24 00:31:21 crc kubenswrapper[4824]: I0224 00:31:21.978550 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cd878c86f-p97zc" podStartSLOduration=2.5804589570000003 podStartE2EDuration="2.978508656s" podCreationTimestamp="2026-02-24 00:31:19 +0000 UTC" firstStartedPulling="2026-02-24 00:31:20.633900362 +0000 UTC m=+1544.623524831" lastFinishedPulling="2026-02-24 00:31:21.031950071 +0000 UTC m=+1545.021574530" observedRunningTime="2026-02-24 00:31:21.976567578 +0000 UTC m=+1545.966192057" watchObservedRunningTime="2026-02-24 00:31:21.978508656 +0000 UTC m=+1545.968133125" Feb 24 00:31:23 crc kubenswrapper[4824]: I0224 00:31:23.276087 4824 patch_prober.go:28] interesting pod/machine-config-daemon-vcbgn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 00:31:23 crc kubenswrapper[4824]: I0224 00:31:23.277363 4824 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 00:31:33 crc kubenswrapper[4824]: I0224 00:31:33.463207 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-c5wht"] Feb 24 00:31:33 crc kubenswrapper[4824]: I0224 00:31:33.464469 4824 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="service-telemetry/default-interconnect-68864d46cb-c5wht" podUID="4ffe4e04-44ea-455a-8788-47d60605ed27" containerName="default-interconnect" containerID="cri-o://29bbfa19ec9b4d4d9603b53deb3db8ea412ca063873b7a0aadd1b1ae2c93f293" gracePeriod=30 Feb 24 00:31:33 crc kubenswrapper[4824]: I0224 00:31:33.888079 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-c5wht" Feb 24 00:31:34 crc kubenswrapper[4824]: I0224 00:31:34.050107 4824 generic.go:334] "Generic (PLEG): container finished" podID="19cb1d3e-5363-406a-a5f4-ecfe04edd347" containerID="5d52767b4ff2eefa3a301cf99e13e98ce67da689c9800b168e08d2bdb25f9f50" exitCode=0 Feb 24 00:31:34 crc kubenswrapper[4824]: I0224 00:31:34.050201 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cd878c86f-p97zc" event={"ID":"19cb1d3e-5363-406a-a5f4-ecfe04edd347","Type":"ContainerDied","Data":"5d52767b4ff2eefa3a301cf99e13e98ce67da689c9800b168e08d2bdb25f9f50"} Feb 24 00:31:34 crc kubenswrapper[4824]: I0224 00:31:34.051127 4824 scope.go:117] "RemoveContainer" containerID="5d52767b4ff2eefa3a301cf99e13e98ce67da689c9800b168e08d2bdb25f9f50" Feb 24 00:31:34 crc kubenswrapper[4824]: I0224 00:31:34.053071 4824 generic.go:334] "Generic (PLEG): container finished" podID="b1b6fe19-ad2f-490e-80dc-39ed80de85b3" containerID="4cf792676db082f4deec5a1cc01c8abbce90a283e92e03176c0a5e3f90a5aa8c" exitCode=0 Feb 24 00:31:34 crc kubenswrapper[4824]: I0224 00:31:34.053118 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-68kfw" event={"ID":"b1b6fe19-ad2f-490e-80dc-39ed80de85b3","Type":"ContainerDied","Data":"4cf792676db082f4deec5a1cc01c8abbce90a283e92e03176c0a5e3f90a5aa8c"} Feb 24 00:31:34 crc kubenswrapper[4824]: I0224 00:31:34.054036 4824 scope.go:117] "RemoveContainer" 
containerID="4cf792676db082f4deec5a1cc01c8abbce90a283e92e03176c0a5e3f90a5aa8c" Feb 24 00:31:34 crc kubenswrapper[4824]: I0224 00:31:34.054929 4824 generic.go:334] "Generic (PLEG): container finished" podID="4ffe4e04-44ea-455a-8788-47d60605ed27" containerID="29bbfa19ec9b4d4d9603b53deb3db8ea412ca063873b7a0aadd1b1ae2c93f293" exitCode=0 Feb 24 00:31:34 crc kubenswrapper[4824]: I0224 00:31:34.055052 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-c5wht" Feb 24 00:31:34 crc kubenswrapper[4824]: I0224 00:31:34.055191 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-c5wht" event={"ID":"4ffe4e04-44ea-455a-8788-47d60605ed27","Type":"ContainerDied","Data":"29bbfa19ec9b4d4d9603b53deb3db8ea412ca063873b7a0aadd1b1ae2c93f293"} Feb 24 00:31:34 crc kubenswrapper[4824]: I0224 00:31:34.055312 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-c5wht" event={"ID":"4ffe4e04-44ea-455a-8788-47d60605ed27","Type":"ContainerDied","Data":"53601482595b879499df40be3a545ef10a2727ef3879843a1a926827995e2ff3"} Feb 24 00:31:34 crc kubenswrapper[4824]: I0224 00:31:34.055333 4824 scope.go:117] "RemoveContainer" containerID="29bbfa19ec9b4d4d9603b53deb3db8ea412ca063873b7a0aadd1b1ae2c93f293" Feb 24 00:31:34 crc kubenswrapper[4824]: I0224 00:31:34.063641 4824 generic.go:334] "Generic (PLEG): container finished" podID="25d3b43f-0bff-44ca-83f4-b8a0052cd764" containerID="a300490310e1c3ebb42c85c5a7fa61913de98e5750b6b6f122bc2e555fca562a" exitCode=0 Feb 24 00:31:34 crc kubenswrapper[4824]: I0224 00:31:34.063723 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5jd84" event={"ID":"25d3b43f-0bff-44ca-83f4-b8a0052cd764","Type":"ContainerDied","Data":"a300490310e1c3ebb42c85c5a7fa61913de98e5750b6b6f122bc2e555fca562a"} Feb 24 00:31:34 
crc kubenswrapper[4824]: I0224 00:31:34.064577 4824 scope.go:117] "RemoveContainer" containerID="a300490310e1c3ebb42c85c5a7fa61913de98e5750b6b6f122bc2e555fca562a" Feb 24 00:31:34 crc kubenswrapper[4824]: I0224 00:31:34.077943 4824 generic.go:334] "Generic (PLEG): container finished" podID="99b264a5-5103-4445-8978-942c71208377" containerID="46e1822d839098966fca9d2fe9fd88f95716adc7230f55dbadcd96a250ec0b01" exitCode=0 Feb 24 00:31:34 crc kubenswrapper[4824]: I0224 00:31:34.078010 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-kmkbw" event={"ID":"99b264a5-5103-4445-8978-942c71208377","Type":"ContainerDied","Data":"46e1822d839098966fca9d2fe9fd88f95716adc7230f55dbadcd96a250ec0b01"} Feb 24 00:31:34 crc kubenswrapper[4824]: I0224 00:31:34.078840 4824 scope.go:117] "RemoveContainer" containerID="46e1822d839098966fca9d2fe9fd88f95716adc7230f55dbadcd96a250ec0b01" Feb 24 00:31:34 crc kubenswrapper[4824]: I0224 00:31:34.084788 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6lw6\" (UniqueName: \"kubernetes.io/projected/4ffe4e04-44ea-455a-8788-47d60605ed27-kube-api-access-s6lw6\") pod \"4ffe4e04-44ea-455a-8788-47d60605ed27\" (UID: \"4ffe4e04-44ea-455a-8788-47d60605ed27\") " Feb 24 00:31:34 crc kubenswrapper[4824]: I0224 00:31:34.084864 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/4ffe4e04-44ea-455a-8788-47d60605ed27-sasl-users\") pod \"4ffe4e04-44ea-455a-8788-47d60605ed27\" (UID: \"4ffe4e04-44ea-455a-8788-47d60605ed27\") " Feb 24 00:31:34 crc kubenswrapper[4824]: I0224 00:31:34.084925 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/4ffe4e04-44ea-455a-8788-47d60605ed27-default-interconnect-inter-router-credentials\") pod 
\"4ffe4e04-44ea-455a-8788-47d60605ed27\" (UID: \"4ffe4e04-44ea-455a-8788-47d60605ed27\") " Feb 24 00:31:34 crc kubenswrapper[4824]: I0224 00:31:34.084983 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/4ffe4e04-44ea-455a-8788-47d60605ed27-default-interconnect-inter-router-ca\") pod \"4ffe4e04-44ea-455a-8788-47d60605ed27\" (UID: \"4ffe4e04-44ea-455a-8788-47d60605ed27\") " Feb 24 00:31:34 crc kubenswrapper[4824]: I0224 00:31:34.085034 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/4ffe4e04-44ea-455a-8788-47d60605ed27-default-interconnect-openstack-ca\") pod \"4ffe4e04-44ea-455a-8788-47d60605ed27\" (UID: \"4ffe4e04-44ea-455a-8788-47d60605ed27\") " Feb 24 00:31:34 crc kubenswrapper[4824]: I0224 00:31:34.085084 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/4ffe4e04-44ea-455a-8788-47d60605ed27-sasl-config\") pod \"4ffe4e04-44ea-455a-8788-47d60605ed27\" (UID: \"4ffe4e04-44ea-455a-8788-47d60605ed27\") " Feb 24 00:31:34 crc kubenswrapper[4824]: I0224 00:31:34.085133 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/4ffe4e04-44ea-455a-8788-47d60605ed27-default-interconnect-openstack-credentials\") pod \"4ffe4e04-44ea-455a-8788-47d60605ed27\" (UID: \"4ffe4e04-44ea-455a-8788-47d60605ed27\") " Feb 24 00:31:34 crc kubenswrapper[4824]: I0224 00:31:34.089893 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ffe4e04-44ea-455a-8788-47d60605ed27-sasl-config" (OuterVolumeSpecName: "sasl-config") pod "4ffe4e04-44ea-455a-8788-47d60605ed27" (UID: "4ffe4e04-44ea-455a-8788-47d60605ed27"). 
InnerVolumeSpecName "sasl-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:31:34 crc kubenswrapper[4824]: I0224 00:31:34.090678 4824 scope.go:117] "RemoveContainer" containerID="29bbfa19ec9b4d4d9603b53deb3db8ea412ca063873b7a0aadd1b1ae2c93f293" Feb 24 00:31:34 crc kubenswrapper[4824]: E0224 00:31:34.092678 4824 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29bbfa19ec9b4d4d9603b53deb3db8ea412ca063873b7a0aadd1b1ae2c93f293\": container with ID starting with 29bbfa19ec9b4d4d9603b53deb3db8ea412ca063873b7a0aadd1b1ae2c93f293 not found: ID does not exist" containerID="29bbfa19ec9b4d4d9603b53deb3db8ea412ca063873b7a0aadd1b1ae2c93f293" Feb 24 00:31:34 crc kubenswrapper[4824]: I0224 00:31:34.092803 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29bbfa19ec9b4d4d9603b53deb3db8ea412ca063873b7a0aadd1b1ae2c93f293"} err="failed to get container status \"29bbfa19ec9b4d4d9603b53deb3db8ea412ca063873b7a0aadd1b1ae2c93f293\": rpc error: code = NotFound desc = could not find container \"29bbfa19ec9b4d4d9603b53deb3db8ea412ca063873b7a0aadd1b1ae2c93f293\": container with ID starting with 29bbfa19ec9b4d4d9603b53deb3db8ea412ca063873b7a0aadd1b1ae2c93f293 not found: ID does not exist" Feb 24 00:31:34 crc kubenswrapper[4824]: I0224 00:31:34.093692 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ffe4e04-44ea-455a-8788-47d60605ed27-default-interconnect-openstack-credentials" (OuterVolumeSpecName: "default-interconnect-openstack-credentials") pod "4ffe4e04-44ea-455a-8788-47d60605ed27" (UID: "4ffe4e04-44ea-455a-8788-47d60605ed27"). InnerVolumeSpecName "default-interconnect-openstack-credentials". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:31:34 crc kubenswrapper[4824]: I0224 00:31:34.103269 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ffe4e04-44ea-455a-8788-47d60605ed27-default-interconnect-inter-router-ca" (OuterVolumeSpecName: "default-interconnect-inter-router-ca") pod "4ffe4e04-44ea-455a-8788-47d60605ed27" (UID: "4ffe4e04-44ea-455a-8788-47d60605ed27"). InnerVolumeSpecName "default-interconnect-inter-router-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:31:34 crc kubenswrapper[4824]: I0224 00:31:34.103421 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ffe4e04-44ea-455a-8788-47d60605ed27-default-interconnect-openstack-ca" (OuterVolumeSpecName: "default-interconnect-openstack-ca") pod "4ffe4e04-44ea-455a-8788-47d60605ed27" (UID: "4ffe4e04-44ea-455a-8788-47d60605ed27"). InnerVolumeSpecName "default-interconnect-openstack-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:31:34 crc kubenswrapper[4824]: I0224 00:31:34.104838 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ffe4e04-44ea-455a-8788-47d60605ed27-kube-api-access-s6lw6" (OuterVolumeSpecName: "kube-api-access-s6lw6") pod "4ffe4e04-44ea-455a-8788-47d60605ed27" (UID: "4ffe4e04-44ea-455a-8788-47d60605ed27"). InnerVolumeSpecName "kube-api-access-s6lw6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:31:34 crc kubenswrapper[4824]: I0224 00:31:34.109286 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ffe4e04-44ea-455a-8788-47d60605ed27-default-interconnect-inter-router-credentials" (OuterVolumeSpecName: "default-interconnect-inter-router-credentials") pod "4ffe4e04-44ea-455a-8788-47d60605ed27" (UID: "4ffe4e04-44ea-455a-8788-47d60605ed27"). InnerVolumeSpecName "default-interconnect-inter-router-credentials". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:31:34 crc kubenswrapper[4824]: I0224 00:31:34.122158 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ffe4e04-44ea-455a-8788-47d60605ed27-sasl-users" (OuterVolumeSpecName: "sasl-users") pod "4ffe4e04-44ea-455a-8788-47d60605ed27" (UID: "4ffe4e04-44ea-455a-8788-47d60605ed27"). InnerVolumeSpecName "sasl-users". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 00:31:34 crc kubenswrapper[4824]: I0224 00:31:34.187258 4824 reconciler_common.go:293] "Volume detached for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/4ffe4e04-44ea-455a-8788-47d60605ed27-sasl-config\") on node \"crc\" DevicePath \"\"" Feb 24 00:31:34 crc kubenswrapper[4824]: I0224 00:31:34.187297 4824 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/4ffe4e04-44ea-455a-8788-47d60605ed27-default-interconnect-openstack-credentials\") on node \"crc\" DevicePath \"\"" Feb 24 00:31:34 crc kubenswrapper[4824]: I0224 00:31:34.187309 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6lw6\" (UniqueName: \"kubernetes.io/projected/4ffe4e04-44ea-455a-8788-47d60605ed27-kube-api-access-s6lw6\") on node \"crc\" DevicePath \"\"" Feb 24 00:31:34 crc kubenswrapper[4824]: I0224 00:31:34.187339 4824 reconciler_common.go:293] "Volume detached for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/4ffe4e04-44ea-455a-8788-47d60605ed27-sasl-users\") on node \"crc\" DevicePath \"\"" Feb 24 00:31:34 crc kubenswrapper[4824]: I0224 00:31:34.187349 4824 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/4ffe4e04-44ea-455a-8788-47d60605ed27-default-interconnect-inter-router-credentials\") on node \"crc\" DevicePath \"\"" Feb 24 00:31:34 crc kubenswrapper[4824]: I0224 00:31:34.187362 4824 
reconciler_common.go:293] "Volume detached for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/4ffe4e04-44ea-455a-8788-47d60605ed27-default-interconnect-inter-router-ca\") on node \"crc\" DevicePath \"\"" Feb 24 00:31:34 crc kubenswrapper[4824]: I0224 00:31:34.187373 4824 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/4ffe4e04-44ea-455a-8788-47d60605ed27-default-interconnect-openstack-ca\") on node \"crc\" DevicePath \"\"" Feb 24 00:31:34 crc kubenswrapper[4824]: I0224 00:31:34.405953 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-c5wht"] Feb 24 00:31:34 crc kubenswrapper[4824]: I0224 00:31:34.406533 4824 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-c5wht"] Feb 24 00:31:34 crc kubenswrapper[4824]: I0224 00:31:34.708295 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ffe4e04-44ea-455a-8788-47d60605ed27" path="/var/lib/kubelet/pods/4ffe4e04-44ea-455a-8788-47d60605ed27/volumes" Feb 24 00:31:35 crc kubenswrapper[4824]: I0224 00:31:35.094759 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5jd84" event={"ID":"25d3b43f-0bff-44ca-83f4-b8a0052cd764","Type":"ContainerStarted","Data":"580e4473703a9287d77dfa0282262b828fdf928feb45462d2929a259ad5db38e"} Feb 24 00:31:35 crc kubenswrapper[4824]: I0224 00:31:35.115840 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-76s9x"] Feb 24 00:31:35 crc kubenswrapper[4824]: E0224 00:31:35.116485 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ffe4e04-44ea-455a-8788-47d60605ed27" containerName="default-interconnect" Feb 24 00:31:35 crc kubenswrapper[4824]: I0224 00:31:35.116509 4824 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="4ffe4e04-44ea-455a-8788-47d60605ed27" containerName="default-interconnect" Feb 24 00:31:35 crc kubenswrapper[4824]: I0224 00:31:35.117058 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ffe4e04-44ea-455a-8788-47d60605ed27" containerName="default-interconnect" Feb 24 00:31:35 crc kubenswrapper[4824]: I0224 00:31:35.118425 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-kmkbw" event={"ID":"99b264a5-5103-4445-8978-942c71208377","Type":"ContainerStarted","Data":"895f0c841b08d7b22272f1ad5924dc1a69f338d44d989cee670afa4ce10c5b44"} Feb 24 00:31:35 crc kubenswrapper[4824]: I0224 00:31:35.120907 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-76s9x" Feb 24 00:31:35 crc kubenswrapper[4824]: I0224 00:31:35.125226 4824 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-credentials" Feb 24 00:31:35 crc kubenswrapper[4824]: I0224 00:31:35.125483 4824 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-ca" Feb 24 00:31:35 crc kubenswrapper[4824]: I0224 00:31:35.125687 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-76s9x"] Feb 24 00:31:35 crc kubenswrapper[4824]: I0224 00:31:35.125719 4824 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-credentials" Feb 24 00:31:35 crc kubenswrapper[4824]: I0224 00:31:35.126775 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cd878c86f-p97zc" event={"ID":"19cb1d3e-5363-406a-a5f4-ecfe04edd347","Type":"ContainerStarted","Data":"eeddf94c39afeaf91d3aad6e7e7b037e372c6ac5b7c06815ce6a33281b134212"} Feb 24 00:31:35 crc 
kubenswrapper[4824]: I0224 00:31:35.128434 4824 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-dockercfg-d6d6z" Feb 24 00:31:35 crc kubenswrapper[4824]: I0224 00:31:35.130136 4824 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-users" Feb 24 00:31:35 crc kubenswrapper[4824]: I0224 00:31:35.130177 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-interconnect-sasl-config" Feb 24 00:31:35 crc kubenswrapper[4824]: I0224 00:31:35.130477 4824 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-ca" Feb 24 00:31:35 crc kubenswrapper[4824]: I0224 00:31:35.163080 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-68kfw" event={"ID":"b1b6fe19-ad2f-490e-80dc-39ed80de85b3","Type":"ContainerStarted","Data":"9daa99339d0f6cbad236603de0bc2f34e81e039ff7ee664ab10491e20bc297e0"} Feb 24 00:31:35 crc kubenswrapper[4824]: I0224 00:31:35.201297 4824 generic.go:334] "Generic (PLEG): container finished" podID="77c39bbc-adcc-40f9-afe2-9d97f93262b9" containerID="c4a8fb8018686ea387d812e3112b01226a43b249311b6283fbdc21bb7472950f" exitCode=0 Feb 24 00:31:35 crc kubenswrapper[4824]: I0224 00:31:35.201349 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7bcfc4c8bf-2d6bt" event={"ID":"77c39bbc-adcc-40f9-afe2-9d97f93262b9","Type":"ContainerDied","Data":"c4a8fb8018686ea387d812e3112b01226a43b249311b6283fbdc21bb7472950f"} Feb 24 00:31:35 crc kubenswrapper[4824]: I0224 00:31:35.202025 4824 scope.go:117] "RemoveContainer" containerID="c4a8fb8018686ea387d812e3112b01226a43b249311b6283fbdc21bb7472950f" Feb 24 00:31:35 crc kubenswrapper[4824]: I0224 00:31:35.225435 4824 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/58055cab-656a-46f2-a3e6-ab76d8943362-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-76s9x\" (UID: \"58055cab-656a-46f2-a3e6-ab76d8943362\") " pod="service-telemetry/default-interconnect-68864d46cb-76s9x" Feb 24 00:31:35 crc kubenswrapper[4824]: I0224 00:31:35.225534 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/58055cab-656a-46f2-a3e6-ab76d8943362-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-76s9x\" (UID: \"58055cab-656a-46f2-a3e6-ab76d8943362\") " pod="service-telemetry/default-interconnect-68864d46cb-76s9x" Feb 24 00:31:35 crc kubenswrapper[4824]: I0224 00:31:35.225576 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/58055cab-656a-46f2-a3e6-ab76d8943362-sasl-users\") pod \"default-interconnect-68864d46cb-76s9x\" (UID: \"58055cab-656a-46f2-a3e6-ab76d8943362\") " pod="service-telemetry/default-interconnect-68864d46cb-76s9x" Feb 24 00:31:35 crc kubenswrapper[4824]: I0224 00:31:35.225680 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/58055cab-656a-46f2-a3e6-ab76d8943362-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-76s9x\" (UID: \"58055cab-656a-46f2-a3e6-ab76d8943362\") " pod="service-telemetry/default-interconnect-68864d46cb-76s9x" Feb 24 00:31:35 crc kubenswrapper[4824]: I0224 00:31:35.225755 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-config\" (UniqueName: 
\"kubernetes.io/configmap/58055cab-656a-46f2-a3e6-ab76d8943362-sasl-config\") pod \"default-interconnect-68864d46cb-76s9x\" (UID: \"58055cab-656a-46f2-a3e6-ab76d8943362\") " pod="service-telemetry/default-interconnect-68864d46cb-76s9x" Feb 24 00:31:35 crc kubenswrapper[4824]: I0224 00:31:35.225888 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fxr9\" (UniqueName: \"kubernetes.io/projected/58055cab-656a-46f2-a3e6-ab76d8943362-kube-api-access-7fxr9\") pod \"default-interconnect-68864d46cb-76s9x\" (UID: \"58055cab-656a-46f2-a3e6-ab76d8943362\") " pod="service-telemetry/default-interconnect-68864d46cb-76s9x" Feb 24 00:31:35 crc kubenswrapper[4824]: I0224 00:31:35.225924 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/58055cab-656a-46f2-a3e6-ab76d8943362-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-76s9x\" (UID: \"58055cab-656a-46f2-a3e6-ab76d8943362\") " pod="service-telemetry/default-interconnect-68864d46cb-76s9x" Feb 24 00:31:35 crc kubenswrapper[4824]: I0224 00:31:35.327613 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/58055cab-656a-46f2-a3e6-ab76d8943362-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-76s9x\" (UID: \"58055cab-656a-46f2-a3e6-ab76d8943362\") " pod="service-telemetry/default-interconnect-68864d46cb-76s9x" Feb 24 00:31:35 crc kubenswrapper[4824]: I0224 00:31:35.327738 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/58055cab-656a-46f2-a3e6-ab76d8943362-sasl-config\") pod \"default-interconnect-68864d46cb-76s9x\" (UID: \"58055cab-656a-46f2-a3e6-ab76d8943362\") " 
pod="service-telemetry/default-interconnect-68864d46cb-76s9x" Feb 24 00:31:35 crc kubenswrapper[4824]: I0224 00:31:35.327804 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fxr9\" (UniqueName: \"kubernetes.io/projected/58055cab-656a-46f2-a3e6-ab76d8943362-kube-api-access-7fxr9\") pod \"default-interconnect-68864d46cb-76s9x\" (UID: \"58055cab-656a-46f2-a3e6-ab76d8943362\") " pod="service-telemetry/default-interconnect-68864d46cb-76s9x" Feb 24 00:31:35 crc kubenswrapper[4824]: I0224 00:31:35.328244 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/58055cab-656a-46f2-a3e6-ab76d8943362-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-76s9x\" (UID: \"58055cab-656a-46f2-a3e6-ab76d8943362\") " pod="service-telemetry/default-interconnect-68864d46cb-76s9x" Feb 24 00:31:35 crc kubenswrapper[4824]: I0224 00:31:35.328687 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/58055cab-656a-46f2-a3e6-ab76d8943362-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-76s9x\" (UID: \"58055cab-656a-46f2-a3e6-ab76d8943362\") " pod="service-telemetry/default-interconnect-68864d46cb-76s9x" Feb 24 00:31:35 crc kubenswrapper[4824]: I0224 00:31:35.329016 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/58055cab-656a-46f2-a3e6-ab76d8943362-sasl-config\") pod \"default-interconnect-68864d46cb-76s9x\" (UID: \"58055cab-656a-46f2-a3e6-ab76d8943362\") " pod="service-telemetry/default-interconnect-68864d46cb-76s9x" Feb 24 00:31:35 crc kubenswrapper[4824]: I0224 00:31:35.329491 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-ca\" 
(UniqueName: \"kubernetes.io/secret/58055cab-656a-46f2-a3e6-ab76d8943362-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-76s9x\" (UID: \"58055cab-656a-46f2-a3e6-ab76d8943362\") " pod="service-telemetry/default-interconnect-68864d46cb-76s9x" Feb 24 00:31:35 crc kubenswrapper[4824]: I0224 00:31:35.329563 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/58055cab-656a-46f2-a3e6-ab76d8943362-sasl-users\") pod \"default-interconnect-68864d46cb-76s9x\" (UID: \"58055cab-656a-46f2-a3e6-ab76d8943362\") " pod="service-telemetry/default-interconnect-68864d46cb-76s9x" Feb 24 00:31:35 crc kubenswrapper[4824]: I0224 00:31:35.337846 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/58055cab-656a-46f2-a3e6-ab76d8943362-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-76s9x\" (UID: \"58055cab-656a-46f2-a3e6-ab76d8943362\") " pod="service-telemetry/default-interconnect-68864d46cb-76s9x" Feb 24 00:31:35 crc kubenswrapper[4824]: I0224 00:31:35.350420 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/58055cab-656a-46f2-a3e6-ab76d8943362-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-76s9x\" (UID: \"58055cab-656a-46f2-a3e6-ab76d8943362\") " pod="service-telemetry/default-interconnect-68864d46cb-76s9x" Feb 24 00:31:35 crc kubenswrapper[4824]: I0224 00:31:35.350452 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fxr9\" (UniqueName: \"kubernetes.io/projected/58055cab-656a-46f2-a3e6-ab76d8943362-kube-api-access-7fxr9\") pod \"default-interconnect-68864d46cb-76s9x\" (UID: \"58055cab-656a-46f2-a3e6-ab76d8943362\") " 
pod="service-telemetry/default-interconnect-68864d46cb-76s9x" Feb 24 00:31:35 crc kubenswrapper[4824]: I0224 00:31:35.351460 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/58055cab-656a-46f2-a3e6-ab76d8943362-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-76s9x\" (UID: \"58055cab-656a-46f2-a3e6-ab76d8943362\") " pod="service-telemetry/default-interconnect-68864d46cb-76s9x" Feb 24 00:31:35 crc kubenswrapper[4824]: I0224 00:31:35.357093 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/58055cab-656a-46f2-a3e6-ab76d8943362-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-76s9x\" (UID: \"58055cab-656a-46f2-a3e6-ab76d8943362\") " pod="service-telemetry/default-interconnect-68864d46cb-76s9x" Feb 24 00:31:35 crc kubenswrapper[4824]: I0224 00:31:35.358069 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/58055cab-656a-46f2-a3e6-ab76d8943362-sasl-users\") pod \"default-interconnect-68864d46cb-76s9x\" (UID: \"58055cab-656a-46f2-a3e6-ab76d8943362\") " pod="service-telemetry/default-interconnect-68864d46cb-76s9x" Feb 24 00:31:35 crc kubenswrapper[4824]: I0224 00:31:35.455177 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-76s9x" Feb 24 00:31:35 crc kubenswrapper[4824]: I0224 00:31:35.741780 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-76s9x"] Feb 24 00:31:35 crc kubenswrapper[4824]: I0224 00:31:35.778359 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/qdr-test"] Feb 24 00:31:35 crc kubenswrapper[4824]: I0224 00:31:35.780223 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/qdr-test" Feb 24 00:31:35 crc kubenswrapper[4824]: I0224 00:31:35.792570 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/qdr-test"] Feb 24 00:31:35 crc kubenswrapper[4824]: I0224 00:31:35.801247 4824 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-selfsigned" Feb 24 00:31:35 crc kubenswrapper[4824]: I0224 00:31:35.801458 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"qdr-test-config" Feb 24 00:31:35 crc kubenswrapper[4824]: I0224 00:31:35.940205 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/ac546686-8945-46b8-8577-da344c7517bd-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"ac546686-8945-46b8-8577-da344c7517bd\") " pod="service-telemetry/qdr-test" Feb 24 00:31:35 crc kubenswrapper[4824]: I0224 00:31:35.940271 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgsvq\" (UniqueName: \"kubernetes.io/projected/ac546686-8945-46b8-8577-da344c7517bd-kube-api-access-bgsvq\") pod \"qdr-test\" (UID: \"ac546686-8945-46b8-8577-da344c7517bd\") " pod="service-telemetry/qdr-test" Feb 24 00:31:35 crc kubenswrapper[4824]: I0224 00:31:35.940318 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/ac546686-8945-46b8-8577-da344c7517bd-qdr-test-config\") pod \"qdr-test\" (UID: \"ac546686-8945-46b8-8577-da344c7517bd\") " pod="service-telemetry/qdr-test" Feb 24 00:31:36 crc kubenswrapper[4824]: I0224 00:31:36.041832 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-selfsigned-cert\" (UniqueName: 
\"kubernetes.io/secret/ac546686-8945-46b8-8577-da344c7517bd-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"ac546686-8945-46b8-8577-da344c7517bd\") " pod="service-telemetry/qdr-test" Feb 24 00:31:36 crc kubenswrapper[4824]: I0224 00:31:36.041905 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgsvq\" (UniqueName: \"kubernetes.io/projected/ac546686-8945-46b8-8577-da344c7517bd-kube-api-access-bgsvq\") pod \"qdr-test\" (UID: \"ac546686-8945-46b8-8577-da344c7517bd\") " pod="service-telemetry/qdr-test" Feb 24 00:31:36 crc kubenswrapper[4824]: I0224 00:31:36.041960 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/ac546686-8945-46b8-8577-da344c7517bd-qdr-test-config\") pod \"qdr-test\" (UID: \"ac546686-8945-46b8-8577-da344c7517bd\") " pod="service-telemetry/qdr-test" Feb 24 00:31:36 crc kubenswrapper[4824]: I0224 00:31:36.042909 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/ac546686-8945-46b8-8577-da344c7517bd-qdr-test-config\") pod \"qdr-test\" (UID: \"ac546686-8945-46b8-8577-da344c7517bd\") " pod="service-telemetry/qdr-test" Feb 24 00:31:36 crc kubenswrapper[4824]: I0224 00:31:36.049089 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/ac546686-8945-46b8-8577-da344c7517bd-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"ac546686-8945-46b8-8577-da344c7517bd\") " pod="service-telemetry/qdr-test" Feb 24 00:31:36 crc kubenswrapper[4824]: I0224 00:31:36.063159 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgsvq\" (UniqueName: \"kubernetes.io/projected/ac546686-8945-46b8-8577-da344c7517bd-kube-api-access-bgsvq\") pod \"qdr-test\" (UID: \"ac546686-8945-46b8-8577-da344c7517bd\") 
" pod="service-telemetry/qdr-test" Feb 24 00:31:36 crc kubenswrapper[4824]: I0224 00:31:36.114454 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/qdr-test" Feb 24 00:31:36 crc kubenswrapper[4824]: I0224 00:31:36.214224 4824 generic.go:334] "Generic (PLEG): container finished" podID="25d3b43f-0bff-44ca-83f4-b8a0052cd764" containerID="580e4473703a9287d77dfa0282262b828fdf928feb45462d2929a259ad5db38e" exitCode=0 Feb 24 00:31:36 crc kubenswrapper[4824]: I0224 00:31:36.214319 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5jd84" event={"ID":"25d3b43f-0bff-44ca-83f4-b8a0052cd764","Type":"ContainerDied","Data":"580e4473703a9287d77dfa0282262b828fdf928feb45462d2929a259ad5db38e"} Feb 24 00:31:36 crc kubenswrapper[4824]: I0224 00:31:36.214375 4824 scope.go:117] "RemoveContainer" containerID="a300490310e1c3ebb42c85c5a7fa61913de98e5750b6b6f122bc2e555fca562a" Feb 24 00:31:36 crc kubenswrapper[4824]: I0224 00:31:36.215255 4824 scope.go:117] "RemoveContainer" containerID="580e4473703a9287d77dfa0282262b828fdf928feb45462d2929a259ad5db38e" Feb 24 00:31:36 crc kubenswrapper[4824]: E0224 00:31:36.215562 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-sens-meter-smartgateway-5759b4d97-5jd84_service-telemetry(25d3b43f-0bff-44ca-83f4-b8a0052cd764)\"" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5jd84" podUID="25d3b43f-0bff-44ca-83f4-b8a0052cd764" Feb 24 00:31:36 crc kubenswrapper[4824]: I0224 00:31:36.218066 4824 generic.go:334] "Generic (PLEG): container finished" podID="b1b6fe19-ad2f-490e-80dc-39ed80de85b3" containerID="9daa99339d0f6cbad236603de0bc2f34e81e039ff7ee664ab10491e20bc297e0" exitCode=0 Feb 24 00:31:36 crc kubenswrapper[4824]: I0224 00:31:36.218154 4824 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-68kfw" event={"ID":"b1b6fe19-ad2f-490e-80dc-39ed80de85b3","Type":"ContainerDied","Data":"9daa99339d0f6cbad236603de0bc2f34e81e039ff7ee664ab10491e20bc297e0"} Feb 24 00:31:36 crc kubenswrapper[4824]: I0224 00:31:36.219106 4824 scope.go:117] "RemoveContainer" containerID="9daa99339d0f6cbad236603de0bc2f34e81e039ff7ee664ab10491e20bc297e0" Feb 24 00:31:36 crc kubenswrapper[4824]: E0224 00:31:36.219368 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-coll-meter-smartgateway-7cd87f9766-68kfw_service-telemetry(b1b6fe19-ad2f-490e-80dc-39ed80de85b3)\"" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-68kfw" podUID="b1b6fe19-ad2f-490e-80dc-39ed80de85b3" Feb 24 00:31:36 crc kubenswrapper[4824]: I0224 00:31:36.222095 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-76s9x" event={"ID":"58055cab-656a-46f2-a3e6-ab76d8943362","Type":"ContainerStarted","Data":"9d91402f05c5c2c741c970ba847c0d642086a04954d84feb7a08ab20acbd6174"} Feb 24 00:31:36 crc kubenswrapper[4824]: I0224 00:31:36.227452 4824 generic.go:334] "Generic (PLEG): container finished" podID="99b264a5-5103-4445-8978-942c71208377" containerID="895f0c841b08d7b22272f1ad5924dc1a69f338d44d989cee670afa4ce10c5b44" exitCode=0 Feb 24 00:31:36 crc kubenswrapper[4824]: I0224 00:31:36.227582 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-kmkbw" event={"ID":"99b264a5-5103-4445-8978-942c71208377","Type":"ContainerDied","Data":"895f0c841b08d7b22272f1ad5924dc1a69f338d44d989cee670afa4ce10c5b44"} Feb 24 00:31:36 crc kubenswrapper[4824]: I0224 00:31:36.230501 4824 scope.go:117] "RemoveContainer" 
containerID="895f0c841b08d7b22272f1ad5924dc1a69f338d44d989cee670afa4ce10c5b44" Feb 24 00:31:36 crc kubenswrapper[4824]: E0224 00:31:36.230909 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-ceil-meter-smartgateway-57948895dc-kmkbw_service-telemetry(99b264a5-5103-4445-8978-942c71208377)\"" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-kmkbw" podUID="99b264a5-5103-4445-8978-942c71208377" Feb 24 00:31:36 crc kubenswrapper[4824]: I0224 00:31:36.261867 4824 generic.go:334] "Generic (PLEG): container finished" podID="19cb1d3e-5363-406a-a5f4-ecfe04edd347" containerID="eeddf94c39afeaf91d3aad6e7e7b037e372c6ac5b7c06815ce6a33281b134212" exitCode=0 Feb 24 00:31:36 crc kubenswrapper[4824]: I0224 00:31:36.261955 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cd878c86f-p97zc" event={"ID":"19cb1d3e-5363-406a-a5f4-ecfe04edd347","Type":"ContainerDied","Data":"eeddf94c39afeaf91d3aad6e7e7b037e372c6ac5b7c06815ce6a33281b134212"} Feb 24 00:31:36 crc kubenswrapper[4824]: I0224 00:31:36.263192 4824 scope.go:117] "RemoveContainer" containerID="eeddf94c39afeaf91d3aad6e7e7b037e372c6ac5b7c06815ce6a33281b134212" Feb 24 00:31:36 crc kubenswrapper[4824]: E0224 00:31:36.263599 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-ceil-event-smartgateway-6cd878c86f-p97zc_service-telemetry(19cb1d3e-5363-406a-a5f4-ecfe04edd347)\"" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cd878c86f-p97zc" podUID="19cb1d3e-5363-406a-a5f4-ecfe04edd347" Feb 24 00:31:36 crc kubenswrapper[4824]: I0224 00:31:36.713285 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/qdr-test"] Feb 24 
00:31:36 crc kubenswrapper[4824]: I0224 00:31:36.716228 4824 scope.go:117] "RemoveContainer" containerID="4cf792676db082f4deec5a1cc01c8abbce90a283e92e03176c0a5e3f90a5aa8c" Feb 24 00:31:36 crc kubenswrapper[4824]: I0224 00:31:36.764333 4824 scope.go:117] "RemoveContainer" containerID="46e1822d839098966fca9d2fe9fd88f95716adc7230f55dbadcd96a250ec0b01" Feb 24 00:31:36 crc kubenswrapper[4824]: I0224 00:31:36.857256 4824 scope.go:117] "RemoveContainer" containerID="5d52767b4ff2eefa3a301cf99e13e98ce67da689c9800b168e08d2bdb25f9f50" Feb 24 00:31:37 crc kubenswrapper[4824]: I0224 00:31:37.275111 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-76s9x" event={"ID":"58055cab-656a-46f2-a3e6-ab76d8943362","Type":"ContainerStarted","Data":"1a6de1ba13a52bd2895d1002ff899b6d430bbd99bd0843e8f204e32a4359d3d3"} Feb 24 00:31:37 crc kubenswrapper[4824]: I0224 00:31:37.277204 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/qdr-test" event={"ID":"ac546686-8945-46b8-8577-da344c7517bd","Type":"ContainerStarted","Data":"d8d4046d29485a031700a6ba696d966708022ce68163fa7c4c11586b2c58bfed"} Feb 24 00:31:37 crc kubenswrapper[4824]: I0224 00:31:37.294547 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7bcfc4c8bf-2d6bt" event={"ID":"77c39bbc-adcc-40f9-afe2-9d97f93262b9","Type":"ContainerStarted","Data":"deeef59fe5eb3f8fdeb082ee224fb29bc82dce17d1d820fb64c7e338965f4541"} Feb 24 00:31:37 crc kubenswrapper[4824]: I0224 00:31:37.308110 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-interconnect-68864d46cb-76s9x" podStartSLOduration=4.308087113 podStartE2EDuration="4.308087113s" podCreationTimestamp="2026-02-24 00:31:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 00:31:37.300080024 +0000 UTC 
m=+1561.289704513" watchObservedRunningTime="2026-02-24 00:31:37.308087113 +0000 UTC m=+1561.297711582" Feb 24 00:31:38 crc kubenswrapper[4824]: I0224 00:31:38.329285 4824 generic.go:334] "Generic (PLEG): container finished" podID="77c39bbc-adcc-40f9-afe2-9d97f93262b9" containerID="deeef59fe5eb3f8fdeb082ee224fb29bc82dce17d1d820fb64c7e338965f4541" exitCode=0 Feb 24 00:31:38 crc kubenswrapper[4824]: I0224 00:31:38.329365 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7bcfc4c8bf-2d6bt" event={"ID":"77c39bbc-adcc-40f9-afe2-9d97f93262b9","Type":"ContainerDied","Data":"deeef59fe5eb3f8fdeb082ee224fb29bc82dce17d1d820fb64c7e338965f4541"} Feb 24 00:31:38 crc kubenswrapper[4824]: I0224 00:31:38.329889 4824 scope.go:117] "RemoveContainer" containerID="c4a8fb8018686ea387d812e3112b01226a43b249311b6283fbdc21bb7472950f" Feb 24 00:31:38 crc kubenswrapper[4824]: I0224 00:31:38.330598 4824 scope.go:117] "RemoveContainer" containerID="deeef59fe5eb3f8fdeb082ee224fb29bc82dce17d1d820fb64c7e338965f4541" Feb 24 00:31:38 crc kubenswrapper[4824]: E0224 00:31:38.330991 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-coll-event-smartgateway-7bcfc4c8bf-2d6bt_service-telemetry(77c39bbc-adcc-40f9-afe2-9d97f93262b9)\"" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7bcfc4c8bf-2d6bt" podUID="77c39bbc-adcc-40f9-afe2-9d97f93262b9" Feb 24 00:31:46 crc kubenswrapper[4824]: I0224 00:31:46.699436 4824 scope.go:117] "RemoveContainer" containerID="580e4473703a9287d77dfa0282262b828fdf928feb45462d2929a259ad5db38e" Feb 24 00:31:48 crc kubenswrapper[4824]: I0224 00:31:48.694187 4824 scope.go:117] "RemoveContainer" containerID="9daa99339d0f6cbad236603de0bc2f34e81e039ff7ee664ab10491e20bc297e0" Feb 24 00:31:49 crc kubenswrapper[4824]: I0224 00:31:49.469715 4824 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-68kfw" event={"ID":"b1b6fe19-ad2f-490e-80dc-39ed80de85b3","Type":"ContainerStarted","Data":"277da33123e68f7a326a3d53ded52b8c20c60ba9b74e4238248092cc640aa3da"} Feb 24 00:31:49 crc kubenswrapper[4824]: I0224 00:31:49.472689 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/qdr-test" event={"ID":"ac546686-8945-46b8-8577-da344c7517bd","Type":"ContainerStarted","Data":"4cd0924f21aeb1557ee2b24022a4dfe30dc99b54c6eb87da4165dc54e1ec62cd"} Feb 24 00:31:49 crc kubenswrapper[4824]: I0224 00:31:49.475430 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5jd84" event={"ID":"25d3b43f-0bff-44ca-83f4-b8a0052cd764","Type":"ContainerStarted","Data":"04a94944d394727f386664b963d2693378e6f68d00eb04b32282f1332c648a92"} Feb 24 00:31:49 crc kubenswrapper[4824]: I0224 00:31:49.535847 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/qdr-test" podStartSLOduration=2.945751584 podStartE2EDuration="14.535811923s" podCreationTimestamp="2026-02-24 00:31:35 +0000 UTC" firstStartedPulling="2026-02-24 00:31:36.743013235 +0000 UTC m=+1560.732637704" lastFinishedPulling="2026-02-24 00:31:48.333073574 +0000 UTC m=+1572.322698043" observedRunningTime="2026-02-24 00:31:49.513902779 +0000 UTC m=+1573.503527268" watchObservedRunningTime="2026-02-24 00:31:49.535811923 +0000 UTC m=+1573.525436392" Feb 24 00:31:49 crc kubenswrapper[4824]: I0224 00:31:49.694002 4824 scope.go:117] "RemoveContainer" containerID="eeddf94c39afeaf91d3aad6e7e7b037e372c6ac5b7c06815ce6a33281b134212" Feb 24 00:31:49 crc kubenswrapper[4824]: I0224 00:31:49.763175 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/stf-smoketest-smoke1-bgdgk"] Feb 24 00:31:49 crc kubenswrapper[4824]: I0224 00:31:49.764639 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-bgdgk" Feb 24 00:31:49 crc kubenswrapper[4824]: I0224 00:31:49.767559 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-healthcheck-log" Feb 24 00:31:49 crc kubenswrapper[4824]: I0224 00:31:49.767844 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-sensubility-config" Feb 24 00:31:49 crc kubenswrapper[4824]: I0224 00:31:49.767978 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-collectd-config" Feb 24 00:31:49 crc kubenswrapper[4824]: I0224 00:31:49.768067 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-collectd-entrypoint-script" Feb 24 00:31:49 crc kubenswrapper[4824]: I0224 00:31:49.774367 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-ceilometer-entrypoint-script" Feb 24 00:31:49 crc kubenswrapper[4824]: I0224 00:31:49.774367 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-ceilometer-publisher" Feb 24 00:31:49 crc kubenswrapper[4824]: I0224 00:31:49.789316 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/stf-smoketest-smoke1-bgdgk"] Feb 24 00:31:49 crc kubenswrapper[4824]: I0224 00:31:49.811211 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4kgg\" (UniqueName: \"kubernetes.io/projected/3b209249-9fc7-4266-9089-9a228d1be14a-kube-api-access-s4kgg\") pod \"stf-smoketest-smoke1-bgdgk\" (UID: \"3b209249-9fc7-4266-9089-9a228d1be14a\") " pod="service-telemetry/stf-smoketest-smoke1-bgdgk" Feb 24 00:31:49 crc kubenswrapper[4824]: I0224 00:31:49.811299 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/3b209249-9fc7-4266-9089-9a228d1be14a-ceilometer-publisher\") pod \"stf-smoketest-smoke1-bgdgk\" (UID: \"3b209249-9fc7-4266-9089-9a228d1be14a\") " pod="service-telemetry/stf-smoketest-smoke1-bgdgk" Feb 24 00:31:49 crc kubenswrapper[4824]: I0224 00:31:49.811342 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/3b209249-9fc7-4266-9089-9a228d1be14a-collectd-config\") pod \"stf-smoketest-smoke1-bgdgk\" (UID: \"3b209249-9fc7-4266-9089-9a228d1be14a\") " pod="service-telemetry/stf-smoketest-smoke1-bgdgk" Feb 24 00:31:49 crc kubenswrapper[4824]: I0224 00:31:49.811390 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/3b209249-9fc7-4266-9089-9a228d1be14a-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-bgdgk\" (UID: \"3b209249-9fc7-4266-9089-9a228d1be14a\") " pod="service-telemetry/stf-smoketest-smoke1-bgdgk" Feb 24 00:31:49 crc kubenswrapper[4824]: I0224 00:31:49.811465 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/3b209249-9fc7-4266-9089-9a228d1be14a-sensubility-config\") pod \"stf-smoketest-smoke1-bgdgk\" (UID: \"3b209249-9fc7-4266-9089-9a228d1be14a\") " pod="service-telemetry/stf-smoketest-smoke1-bgdgk" Feb 24 00:31:49 crc kubenswrapper[4824]: I0224 00:31:49.811533 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/3b209249-9fc7-4266-9089-9a228d1be14a-healthcheck-log\") pod \"stf-smoketest-smoke1-bgdgk\" (UID: \"3b209249-9fc7-4266-9089-9a228d1be14a\") " pod="service-telemetry/stf-smoketest-smoke1-bgdgk" Feb 24 00:31:49 crc kubenswrapper[4824]: I0224 
00:31:49.811602 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/3b209249-9fc7-4266-9089-9a228d1be14a-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-bgdgk\" (UID: \"3b209249-9fc7-4266-9089-9a228d1be14a\") " pod="service-telemetry/stf-smoketest-smoke1-bgdgk" Feb 24 00:31:49 crc kubenswrapper[4824]: I0224 00:31:49.913453 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/3b209249-9fc7-4266-9089-9a228d1be14a-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-bgdgk\" (UID: \"3b209249-9fc7-4266-9089-9a228d1be14a\") " pod="service-telemetry/stf-smoketest-smoke1-bgdgk" Feb 24 00:31:49 crc kubenswrapper[4824]: I0224 00:31:49.913537 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4kgg\" (UniqueName: \"kubernetes.io/projected/3b209249-9fc7-4266-9089-9a228d1be14a-kube-api-access-s4kgg\") pod \"stf-smoketest-smoke1-bgdgk\" (UID: \"3b209249-9fc7-4266-9089-9a228d1be14a\") " pod="service-telemetry/stf-smoketest-smoke1-bgdgk" Feb 24 00:31:49 crc kubenswrapper[4824]: I0224 00:31:49.913582 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/3b209249-9fc7-4266-9089-9a228d1be14a-ceilometer-publisher\") pod \"stf-smoketest-smoke1-bgdgk\" (UID: \"3b209249-9fc7-4266-9089-9a228d1be14a\") " pod="service-telemetry/stf-smoketest-smoke1-bgdgk" Feb 24 00:31:49 crc kubenswrapper[4824]: I0224 00:31:49.913617 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/3b209249-9fc7-4266-9089-9a228d1be14a-collectd-config\") pod \"stf-smoketest-smoke1-bgdgk\" (UID: \"3b209249-9fc7-4266-9089-9a228d1be14a\") " 
pod="service-telemetry/stf-smoketest-smoke1-bgdgk" Feb 24 00:31:49 crc kubenswrapper[4824]: I0224 00:31:49.913652 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/3b209249-9fc7-4266-9089-9a228d1be14a-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-bgdgk\" (UID: \"3b209249-9fc7-4266-9089-9a228d1be14a\") " pod="service-telemetry/stf-smoketest-smoke1-bgdgk" Feb 24 00:31:49 crc kubenswrapper[4824]: I0224 00:31:49.913698 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/3b209249-9fc7-4266-9089-9a228d1be14a-sensubility-config\") pod \"stf-smoketest-smoke1-bgdgk\" (UID: \"3b209249-9fc7-4266-9089-9a228d1be14a\") " pod="service-telemetry/stf-smoketest-smoke1-bgdgk" Feb 24 00:31:49 crc kubenswrapper[4824]: I0224 00:31:49.913727 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/3b209249-9fc7-4266-9089-9a228d1be14a-healthcheck-log\") pod \"stf-smoketest-smoke1-bgdgk\" (UID: \"3b209249-9fc7-4266-9089-9a228d1be14a\") " pod="service-telemetry/stf-smoketest-smoke1-bgdgk" Feb 24 00:31:49 crc kubenswrapper[4824]: I0224 00:31:49.914933 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/3b209249-9fc7-4266-9089-9a228d1be14a-healthcheck-log\") pod \"stf-smoketest-smoke1-bgdgk\" (UID: \"3b209249-9fc7-4266-9089-9a228d1be14a\") " pod="service-telemetry/stf-smoketest-smoke1-bgdgk" Feb 24 00:31:49 crc kubenswrapper[4824]: I0224 00:31:49.915485 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/3b209249-9fc7-4266-9089-9a228d1be14a-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-bgdgk\" (UID: 
\"3b209249-9fc7-4266-9089-9a228d1be14a\") " pod="service-telemetry/stf-smoketest-smoke1-bgdgk" Feb 24 00:31:49 crc kubenswrapper[4824]: I0224 00:31:49.916450 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/3b209249-9fc7-4266-9089-9a228d1be14a-ceilometer-publisher\") pod \"stf-smoketest-smoke1-bgdgk\" (UID: \"3b209249-9fc7-4266-9089-9a228d1be14a\") " pod="service-telemetry/stf-smoketest-smoke1-bgdgk" Feb 24 00:31:49 crc kubenswrapper[4824]: I0224 00:31:49.917097 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/3b209249-9fc7-4266-9089-9a228d1be14a-collectd-config\") pod \"stf-smoketest-smoke1-bgdgk\" (UID: \"3b209249-9fc7-4266-9089-9a228d1be14a\") " pod="service-telemetry/stf-smoketest-smoke1-bgdgk" Feb 24 00:31:49 crc kubenswrapper[4824]: I0224 00:31:49.929728 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/3b209249-9fc7-4266-9089-9a228d1be14a-sensubility-config\") pod \"stf-smoketest-smoke1-bgdgk\" (UID: \"3b209249-9fc7-4266-9089-9a228d1be14a\") " pod="service-telemetry/stf-smoketest-smoke1-bgdgk" Feb 24 00:31:49 crc kubenswrapper[4824]: I0224 00:31:49.930043 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/3b209249-9fc7-4266-9089-9a228d1be14a-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-bgdgk\" (UID: \"3b209249-9fc7-4266-9089-9a228d1be14a\") " pod="service-telemetry/stf-smoketest-smoke1-bgdgk" Feb 24 00:31:49 crc kubenswrapper[4824]: I0224 00:31:49.942374 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4kgg\" (UniqueName: \"kubernetes.io/projected/3b209249-9fc7-4266-9089-9a228d1be14a-kube-api-access-s4kgg\") pod \"stf-smoketest-smoke1-bgdgk\" (UID: 
\"3b209249-9fc7-4266-9089-9a228d1be14a\") " pod="service-telemetry/stf-smoketest-smoke1-bgdgk" Feb 24 00:31:50 crc kubenswrapper[4824]: I0224 00:31:50.073128 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/curl"] Feb 24 00:31:50 crc kubenswrapper[4824]: I0224 00:31:50.074607 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/curl" Feb 24 00:31:50 crc kubenswrapper[4824]: I0224 00:31:50.080506 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/curl"] Feb 24 00:31:50 crc kubenswrapper[4824]: I0224 00:31:50.080725 4824 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-bgdgk" Feb 24 00:31:50 crc kubenswrapper[4824]: I0224 00:31:50.121188 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84jbw\" (UniqueName: \"kubernetes.io/projected/c03665a8-62b6-481d-a01b-4ce2932d9abb-kube-api-access-84jbw\") pod \"curl\" (UID: \"c03665a8-62b6-481d-a01b-4ce2932d9abb\") " pod="service-telemetry/curl" Feb 24 00:31:50 crc kubenswrapper[4824]: I0224 00:31:50.224029 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84jbw\" (UniqueName: \"kubernetes.io/projected/c03665a8-62b6-481d-a01b-4ce2932d9abb-kube-api-access-84jbw\") pod \"curl\" (UID: \"c03665a8-62b6-481d-a01b-4ce2932d9abb\") " pod="service-telemetry/curl" Feb 24 00:31:50 crc kubenswrapper[4824]: I0224 00:31:50.251292 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84jbw\" (UniqueName: \"kubernetes.io/projected/c03665a8-62b6-481d-a01b-4ce2932d9abb-kube-api-access-84jbw\") pod \"curl\" (UID: \"c03665a8-62b6-481d-a01b-4ce2932d9abb\") " pod="service-telemetry/curl" Feb 24 00:31:50 crc kubenswrapper[4824]: W0224 00:31:50.478095 4824 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b209249_9fc7_4266_9089_9a228d1be14a.slice/crio-508749b5e1ef331b99876faa1b1b095b9f3d7ad6af9e6509ba2456478ab33bef WatchSource:0}: Error finding container 508749b5e1ef331b99876faa1b1b095b9f3d7ad6af9e6509ba2456478ab33bef: Status 404 returned error can't find the container with id 508749b5e1ef331b99876faa1b1b095b9f3d7ad6af9e6509ba2456478ab33bef Feb 24 00:31:50 crc kubenswrapper[4824]: I0224 00:31:50.488207 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/stf-smoketest-smoke1-bgdgk"] Feb 24 00:31:50 crc kubenswrapper[4824]: I0224 00:31:50.490969 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-6cd878c86f-p97zc" event={"ID":"19cb1d3e-5363-406a-a5f4-ecfe04edd347","Type":"ContainerStarted","Data":"46bfc54547b0d47a2ad1ebff3e2b4886bc99743c52a2ef8de6022717d2214c15"} Feb 24 00:31:50 crc kubenswrapper[4824]: I0224 00:31:50.503903 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-bgdgk" event={"ID":"3b209249-9fc7-4266-9089-9a228d1be14a","Type":"ContainerStarted","Data":"508749b5e1ef331b99876faa1b1b095b9f3d7ad6af9e6509ba2456478ab33bef"} Feb 24 00:31:50 crc kubenswrapper[4824]: I0224 00:31:50.507369 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/curl" Feb 24 00:31:50 crc kubenswrapper[4824]: I0224 00:31:50.694288 4824 scope.go:117] "RemoveContainer" containerID="895f0c841b08d7b22272f1ad5924dc1a69f338d44d989cee670afa4ce10c5b44" Feb 24 00:31:50 crc kubenswrapper[4824]: I0224 00:31:50.759760 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/curl"] Feb 24 00:31:51 crc kubenswrapper[4824]: I0224 00:31:51.513536 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"c03665a8-62b6-481d-a01b-4ce2932d9abb","Type":"ContainerStarted","Data":"c74452911c8393037ee145106df2c4df90e84349d452289637be6905cc800c2b"} Feb 24 00:31:51 crc kubenswrapper[4824]: I0224 00:31:51.517075 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-kmkbw" event={"ID":"99b264a5-5103-4445-8978-942c71208377","Type":"ContainerStarted","Data":"92cf2fcd4ded94a1c20ae4699ffcee082dda992896e77f610d94dda5872087eb"} Feb 24 00:31:53 crc kubenswrapper[4824]: I0224 00:31:53.275993 4824 patch_prober.go:28] interesting pod/machine-config-daemon-vcbgn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 00:31:53 crc kubenswrapper[4824]: I0224 00:31:53.276534 4824 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 00:31:53 crc kubenswrapper[4824]: I0224 00:31:53.276603 4824 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" Feb 24 
00:31:53 crc kubenswrapper[4824]: I0224 00:31:53.277489 4824 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1fadecb8108bb82124c951985733cccd456bd51e1965c7fbca274984d7901364"} pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 24 00:31:53 crc kubenswrapper[4824]: I0224 00:31:53.277560 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" containerName="machine-config-daemon" containerID="cri-o://1fadecb8108bb82124c951985733cccd456bd51e1965c7fbca274984d7901364" gracePeriod=600 Feb 24 00:31:53 crc kubenswrapper[4824]: E0224 00:31:53.412223 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vcbgn_openshift-machine-config-operator(939ca085-9383-42e6-b7d6-37f101137273)\"" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" Feb 24 00:31:53 crc kubenswrapper[4824]: I0224 00:31:53.539396 4824 generic.go:334] "Generic (PLEG): container finished" podID="c03665a8-62b6-481d-a01b-4ce2932d9abb" containerID="a066438f83b79f6d2d2cc0502f171ad43237becee4855645e99d4cfdc391dae2" exitCode=0 Feb 24 00:31:53 crc kubenswrapper[4824]: I0224 00:31:53.539493 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"c03665a8-62b6-481d-a01b-4ce2932d9abb","Type":"ContainerDied","Data":"a066438f83b79f6d2d2cc0502f171ad43237becee4855645e99d4cfdc391dae2"} Feb 24 00:31:53 crc kubenswrapper[4824]: I0224 00:31:53.547786 4824 generic.go:334] "Generic (PLEG): container finished" 
podID="939ca085-9383-42e6-b7d6-37f101137273" containerID="1fadecb8108bb82124c951985733cccd456bd51e1965c7fbca274984d7901364" exitCode=0 Feb 24 00:31:53 crc kubenswrapper[4824]: I0224 00:31:53.547873 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" event={"ID":"939ca085-9383-42e6-b7d6-37f101137273","Type":"ContainerDied","Data":"1fadecb8108bb82124c951985733cccd456bd51e1965c7fbca274984d7901364"} Feb 24 00:31:53 crc kubenswrapper[4824]: I0224 00:31:53.547944 4824 scope.go:117] "RemoveContainer" containerID="8d819df51f5c54106cb947aecba467be6a7835d606611afb3e6526ac4d026f80" Feb 24 00:31:53 crc kubenswrapper[4824]: I0224 00:31:53.548938 4824 scope.go:117] "RemoveContainer" containerID="1fadecb8108bb82124c951985733cccd456bd51e1965c7fbca274984d7901364" Feb 24 00:31:53 crc kubenswrapper[4824]: E0224 00:31:53.549301 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vcbgn_openshift-machine-config-operator(939ca085-9383-42e6-b7d6-37f101137273)\"" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" Feb 24 00:31:53 crc kubenswrapper[4824]: I0224 00:31:53.695841 4824 scope.go:117] "RemoveContainer" containerID="deeef59fe5eb3f8fdeb082ee224fb29bc82dce17d1d820fb64c7e338965f4541" Feb 24 00:31:54 crc kubenswrapper[4824]: I0224 00:31:54.560786 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7bcfc4c8bf-2d6bt" event={"ID":"77c39bbc-adcc-40f9-afe2-9d97f93262b9","Type":"ContainerStarted","Data":"7c4b2444fb7012361af857e8af3e0676908edaf842b85ee8dbff426ef5a1d070"} Feb 24 00:31:55 crc kubenswrapper[4824]: I0224 00:31:55.039286 4824 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/curl" Feb 24 00:31:55 crc kubenswrapper[4824]: I0224 00:31:55.203200 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_curl_c03665a8-62b6-481d-a01b-4ce2932d9abb/curl/0.log" Feb 24 00:31:55 crc kubenswrapper[4824]: I0224 00:31:55.233012 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84jbw\" (UniqueName: \"kubernetes.io/projected/c03665a8-62b6-481d-a01b-4ce2932d9abb-kube-api-access-84jbw\") pod \"c03665a8-62b6-481d-a01b-4ce2932d9abb\" (UID: \"c03665a8-62b6-481d-a01b-4ce2932d9abb\") " Feb 24 00:31:55 crc kubenswrapper[4824]: I0224 00:31:55.276854 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03665a8-62b6-481d-a01b-4ce2932d9abb-kube-api-access-84jbw" (OuterVolumeSpecName: "kube-api-access-84jbw") pod "c03665a8-62b6-481d-a01b-4ce2932d9abb" (UID: "c03665a8-62b6-481d-a01b-4ce2932d9abb"). InnerVolumeSpecName "kube-api-access-84jbw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:31:55 crc kubenswrapper[4824]: I0224 00:31:55.334862 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84jbw\" (UniqueName: \"kubernetes.io/projected/c03665a8-62b6-481d-a01b-4ce2932d9abb-kube-api-access-84jbw\") on node \"crc\" DevicePath \"\"" Feb 24 00:31:55 crc kubenswrapper[4824]: I0224 00:31:55.495131 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-6856cfb745-zxg6n_13d35d6f-04c4-438a-bda9-ce9c4ed84b99/prometheus-webhook-snmp/0.log" Feb 24 00:31:55 crc kubenswrapper[4824]: I0224 00:31:55.572562 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"c03665a8-62b6-481d-a01b-4ce2932d9abb","Type":"ContainerDied","Data":"c74452911c8393037ee145106df2c4df90e84349d452289637be6905cc800c2b"} Feb 24 00:31:55 crc kubenswrapper[4824]: I0224 00:31:55.572611 4824 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c74452911c8393037ee145106df2c4df90e84349d452289637be6905cc800c2b" Feb 24 00:31:55 crc kubenswrapper[4824]: I0224 00:31:55.572692 4824 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/curl" Feb 24 00:32:04 crc kubenswrapper[4824]: I0224 00:32:04.648025 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-bgdgk" event={"ID":"3b209249-9fc7-4266-9089-9a228d1be14a","Type":"ContainerStarted","Data":"ee868c29c29041bfc7c890bf03233d4fb588940a093db29326b9e176fdb4a0f4"} Feb 24 00:32:05 crc kubenswrapper[4824]: I0224 00:32:05.694545 4824 scope.go:117] "RemoveContainer" containerID="1fadecb8108bb82124c951985733cccd456bd51e1965c7fbca274984d7901364" Feb 24 00:32:05 crc kubenswrapper[4824]: E0224 00:32:05.695471 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vcbgn_openshift-machine-config-operator(939ca085-9383-42e6-b7d6-37f101137273)\"" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" Feb 24 00:32:15 crc kubenswrapper[4824]: I0224 00:32:15.754594 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-bgdgk" event={"ID":"3b209249-9fc7-4266-9089-9a228d1be14a","Type":"ContainerStarted","Data":"e39c45acb66eaf25d0796fbc6e40a1a5197765b7c4699af74ae2a62987aca171"} Feb 24 00:32:15 crc kubenswrapper[4824]: I0224 00:32:15.779579 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/stf-smoketest-smoke1-bgdgk" podStartSLOduration=2.717686988 podStartE2EDuration="26.779561478s" podCreationTimestamp="2026-02-24 00:31:49 +0000 UTC" firstStartedPulling="2026-02-24 00:31:50.481197449 +0000 UTC m=+1574.470821918" lastFinishedPulling="2026-02-24 00:32:14.543071939 +0000 UTC m=+1598.532696408" observedRunningTime="2026-02-24 00:32:15.773966899 +0000 UTC m=+1599.763591488" watchObservedRunningTime="2026-02-24 00:32:15.779561478 +0000 UTC 
m=+1599.769185947" Feb 24 00:32:19 crc kubenswrapper[4824]: I0224 00:32:19.693823 4824 scope.go:117] "RemoveContainer" containerID="1fadecb8108bb82124c951985733cccd456bd51e1965c7fbca274984d7901364" Feb 24 00:32:19 crc kubenswrapper[4824]: E0224 00:32:19.694911 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vcbgn_openshift-machine-config-operator(939ca085-9383-42e6-b7d6-37f101137273)\"" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" Feb 24 00:32:25 crc kubenswrapper[4824]: I0224 00:32:25.662475 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-6856cfb745-zxg6n_13d35d6f-04c4-438a-bda9-ce9c4ed84b99/prometheus-webhook-snmp/0.log" Feb 24 00:32:30 crc kubenswrapper[4824]: I0224 00:32:30.799569 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lwznq"] Feb 24 00:32:30 crc kubenswrapper[4824]: E0224 00:32:30.800700 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c03665a8-62b6-481d-a01b-4ce2932d9abb" containerName="curl" Feb 24 00:32:30 crc kubenswrapper[4824]: I0224 00:32:30.800716 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="c03665a8-62b6-481d-a01b-4ce2932d9abb" containerName="curl" Feb 24 00:32:30 crc kubenswrapper[4824]: I0224 00:32:30.800871 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="c03665a8-62b6-481d-a01b-4ce2932d9abb" containerName="curl" Feb 24 00:32:30 crc kubenswrapper[4824]: I0224 00:32:30.802763 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lwznq" Feb 24 00:32:30 crc kubenswrapper[4824]: I0224 00:32:30.825401 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lwznq"] Feb 24 00:32:30 crc kubenswrapper[4824]: I0224 00:32:30.999111 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fc5db4c-2be5-4a26-8e98-c4fb482b1cea-utilities\") pod \"community-operators-lwznq\" (UID: \"2fc5db4c-2be5-4a26-8e98-c4fb482b1cea\") " pod="openshift-marketplace/community-operators-lwznq" Feb 24 00:32:30 crc kubenswrapper[4824]: I0224 00:32:30.999178 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fc5db4c-2be5-4a26-8e98-c4fb482b1cea-catalog-content\") pod \"community-operators-lwznq\" (UID: \"2fc5db4c-2be5-4a26-8e98-c4fb482b1cea\") " pod="openshift-marketplace/community-operators-lwznq" Feb 24 00:32:31 crc kubenswrapper[4824]: I0224 00:32:30.999229 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf4q9\" (UniqueName: \"kubernetes.io/projected/2fc5db4c-2be5-4a26-8e98-c4fb482b1cea-kube-api-access-kf4q9\") pod \"community-operators-lwznq\" (UID: \"2fc5db4c-2be5-4a26-8e98-c4fb482b1cea\") " pod="openshift-marketplace/community-operators-lwznq" Feb 24 00:32:31 crc kubenswrapper[4824]: I0224 00:32:31.101753 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fc5db4c-2be5-4a26-8e98-c4fb482b1cea-utilities\") pod \"community-operators-lwznq\" (UID: \"2fc5db4c-2be5-4a26-8e98-c4fb482b1cea\") " pod="openshift-marketplace/community-operators-lwznq" Feb 24 00:32:31 crc kubenswrapper[4824]: I0224 00:32:31.102339 4824 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fc5db4c-2be5-4a26-8e98-c4fb482b1cea-catalog-content\") pod \"community-operators-lwznq\" (UID: \"2fc5db4c-2be5-4a26-8e98-c4fb482b1cea\") " pod="openshift-marketplace/community-operators-lwznq" Feb 24 00:32:31 crc kubenswrapper[4824]: I0224 00:32:31.102375 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kf4q9\" (UniqueName: \"kubernetes.io/projected/2fc5db4c-2be5-4a26-8e98-c4fb482b1cea-kube-api-access-kf4q9\") pod \"community-operators-lwznq\" (UID: \"2fc5db4c-2be5-4a26-8e98-c4fb482b1cea\") " pod="openshift-marketplace/community-operators-lwznq" Feb 24 00:32:31 crc kubenswrapper[4824]: I0224 00:32:31.102554 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fc5db4c-2be5-4a26-8e98-c4fb482b1cea-utilities\") pod \"community-operators-lwznq\" (UID: \"2fc5db4c-2be5-4a26-8e98-c4fb482b1cea\") " pod="openshift-marketplace/community-operators-lwznq" Feb 24 00:32:31 crc kubenswrapper[4824]: I0224 00:32:31.102975 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fc5db4c-2be5-4a26-8e98-c4fb482b1cea-catalog-content\") pod \"community-operators-lwznq\" (UID: \"2fc5db4c-2be5-4a26-8e98-c4fb482b1cea\") " pod="openshift-marketplace/community-operators-lwznq" Feb 24 00:32:31 crc kubenswrapper[4824]: I0224 00:32:31.134439 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf4q9\" (UniqueName: \"kubernetes.io/projected/2fc5db4c-2be5-4a26-8e98-c4fb482b1cea-kube-api-access-kf4q9\") pod \"community-operators-lwznq\" (UID: \"2fc5db4c-2be5-4a26-8e98-c4fb482b1cea\") " pod="openshift-marketplace/community-operators-lwznq" Feb 24 00:32:31 crc kubenswrapper[4824]: I0224 00:32:31.433828 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lwznq" Feb 24 00:32:31 crc kubenswrapper[4824]: I0224 00:32:31.700196 4824 scope.go:117] "RemoveContainer" containerID="1fadecb8108bb82124c951985733cccd456bd51e1965c7fbca274984d7901364" Feb 24 00:32:31 crc kubenswrapper[4824]: E0224 00:32:31.700966 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vcbgn_openshift-machine-config-operator(939ca085-9383-42e6-b7d6-37f101137273)\"" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" Feb 24 00:32:31 crc kubenswrapper[4824]: I0224 00:32:31.742712 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lwznq"] Feb 24 00:32:31 crc kubenswrapper[4824]: I0224 00:32:31.902505 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lwznq" event={"ID":"2fc5db4c-2be5-4a26-8e98-c4fb482b1cea","Type":"ContainerStarted","Data":"a57b0ccc6fb7339c031ffe79a75a2452b49fe5975dbc37e0e6d1e4a1c43a1a3c"} Feb 24 00:32:32 crc kubenswrapper[4824]: I0224 00:32:32.912816 4824 generic.go:334] "Generic (PLEG): container finished" podID="2fc5db4c-2be5-4a26-8e98-c4fb482b1cea" containerID="f1791b119fda8a76356ec44d4cb941205460e154a17e6bf25478d2d193834b3a" exitCode=0 Feb 24 00:32:32 crc kubenswrapper[4824]: I0224 00:32:32.912897 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lwznq" event={"ID":"2fc5db4c-2be5-4a26-8e98-c4fb482b1cea","Type":"ContainerDied","Data":"f1791b119fda8a76356ec44d4cb941205460e154a17e6bf25478d2d193834b3a"} Feb 24 00:32:33 crc kubenswrapper[4824]: I0224 00:32:33.921926 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lwznq" 
event={"ID":"2fc5db4c-2be5-4a26-8e98-c4fb482b1cea","Type":"ContainerStarted","Data":"1105fc0199081feabbc77e0aa8649e4fc6931a85dd0bd0a288d174bf9d62190b"} Feb 24 00:32:34 crc kubenswrapper[4824]: I0224 00:32:34.932858 4824 generic.go:334] "Generic (PLEG): container finished" podID="2fc5db4c-2be5-4a26-8e98-c4fb482b1cea" containerID="1105fc0199081feabbc77e0aa8649e4fc6931a85dd0bd0a288d174bf9d62190b" exitCode=0 Feb 24 00:32:34 crc kubenswrapper[4824]: I0224 00:32:34.932952 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lwznq" event={"ID":"2fc5db4c-2be5-4a26-8e98-c4fb482b1cea","Type":"ContainerDied","Data":"1105fc0199081feabbc77e0aa8649e4fc6931a85dd0bd0a288d174bf9d62190b"} Feb 24 00:32:35 crc kubenswrapper[4824]: I0224 00:32:35.942973 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lwznq" event={"ID":"2fc5db4c-2be5-4a26-8e98-c4fb482b1cea","Type":"ContainerStarted","Data":"e6fdead5bb2143b4d1e75d494c8e4c428dc0fa6bb4c070ad94edff3cba2dd209"} Feb 24 00:32:35 crc kubenswrapper[4824]: I0224 00:32:35.968795 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lwznq" podStartSLOduration=3.376302698 podStartE2EDuration="5.968773122s" podCreationTimestamp="2026-02-24 00:32:30 +0000 UTC" firstStartedPulling="2026-02-24 00:32:32.915381737 +0000 UTC m=+1616.905006206" lastFinishedPulling="2026-02-24 00:32:35.507852161 +0000 UTC m=+1619.497476630" observedRunningTime="2026-02-24 00:32:35.963204513 +0000 UTC m=+1619.952828982" watchObservedRunningTime="2026-02-24 00:32:35.968773122 +0000 UTC m=+1619.958397601" Feb 24 00:32:38 crc kubenswrapper[4824]: I0224 00:32:38.991209 4824 generic.go:334] "Generic (PLEG): container finished" podID="3b209249-9fc7-4266-9089-9a228d1be14a" containerID="ee868c29c29041bfc7c890bf03233d4fb588940a093db29326b9e176fdb4a0f4" exitCode=0 Feb 24 00:32:38 crc kubenswrapper[4824]: I0224 
00:32:38.991282 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-bgdgk" event={"ID":"3b209249-9fc7-4266-9089-9a228d1be14a","Type":"ContainerDied","Data":"ee868c29c29041bfc7c890bf03233d4fb588940a093db29326b9e176fdb4a0f4"} Feb 24 00:32:38 crc kubenswrapper[4824]: I0224 00:32:38.992628 4824 scope.go:117] "RemoveContainer" containerID="ee868c29c29041bfc7c890bf03233d4fb588940a093db29326b9e176fdb4a0f4" Feb 24 00:32:41 crc kubenswrapper[4824]: I0224 00:32:41.434424 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lwznq" Feb 24 00:32:41 crc kubenswrapper[4824]: I0224 00:32:41.435639 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lwznq" Feb 24 00:32:41 crc kubenswrapper[4824]: I0224 00:32:41.477726 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lwznq" Feb 24 00:32:42 crc kubenswrapper[4824]: I0224 00:32:42.054833 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lwznq" Feb 24 00:32:42 crc kubenswrapper[4824]: I0224 00:32:42.114604 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lwznq"] Feb 24 00:32:42 crc kubenswrapper[4824]: I0224 00:32:42.693593 4824 scope.go:117] "RemoveContainer" containerID="1fadecb8108bb82124c951985733cccd456bd51e1965c7fbca274984d7901364" Feb 24 00:32:42 crc kubenswrapper[4824]: E0224 00:32:42.693871 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vcbgn_openshift-machine-config-operator(939ca085-9383-42e6-b7d6-37f101137273)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" Feb 24 00:32:44 crc kubenswrapper[4824]: I0224 00:32:44.033074 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lwznq" podUID="2fc5db4c-2be5-4a26-8e98-c4fb482b1cea" containerName="registry-server" containerID="cri-o://e6fdead5bb2143b4d1e75d494c8e4c428dc0fa6bb4c070ad94edff3cba2dd209" gracePeriod=2 Feb 24 00:32:44 crc kubenswrapper[4824]: I0224 00:32:44.473230 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lwznq" Feb 24 00:32:44 crc kubenswrapper[4824]: I0224 00:32:44.549724 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kf4q9\" (UniqueName: \"kubernetes.io/projected/2fc5db4c-2be5-4a26-8e98-c4fb482b1cea-kube-api-access-kf4q9\") pod \"2fc5db4c-2be5-4a26-8e98-c4fb482b1cea\" (UID: \"2fc5db4c-2be5-4a26-8e98-c4fb482b1cea\") " Feb 24 00:32:44 crc kubenswrapper[4824]: I0224 00:32:44.549875 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fc5db4c-2be5-4a26-8e98-c4fb482b1cea-utilities\") pod \"2fc5db4c-2be5-4a26-8e98-c4fb482b1cea\" (UID: \"2fc5db4c-2be5-4a26-8e98-c4fb482b1cea\") " Feb 24 00:32:44 crc kubenswrapper[4824]: I0224 00:32:44.549928 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fc5db4c-2be5-4a26-8e98-c4fb482b1cea-catalog-content\") pod \"2fc5db4c-2be5-4a26-8e98-c4fb482b1cea\" (UID: \"2fc5db4c-2be5-4a26-8e98-c4fb482b1cea\") " Feb 24 00:32:44 crc kubenswrapper[4824]: I0224 00:32:44.551114 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fc5db4c-2be5-4a26-8e98-c4fb482b1cea-utilities" (OuterVolumeSpecName: "utilities") 
pod "2fc5db4c-2be5-4a26-8e98-c4fb482b1cea" (UID: "2fc5db4c-2be5-4a26-8e98-c4fb482b1cea"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:32:44 crc kubenswrapper[4824]: I0224 00:32:44.556445 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fc5db4c-2be5-4a26-8e98-c4fb482b1cea-kube-api-access-kf4q9" (OuterVolumeSpecName: "kube-api-access-kf4q9") pod "2fc5db4c-2be5-4a26-8e98-c4fb482b1cea" (UID: "2fc5db4c-2be5-4a26-8e98-c4fb482b1cea"). InnerVolumeSpecName "kube-api-access-kf4q9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:32:44 crc kubenswrapper[4824]: I0224 00:32:44.612637 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fc5db4c-2be5-4a26-8e98-c4fb482b1cea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2fc5db4c-2be5-4a26-8e98-c4fb482b1cea" (UID: "2fc5db4c-2be5-4a26-8e98-c4fb482b1cea"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:32:44 crc kubenswrapper[4824]: I0224 00:32:44.652135 4824 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fc5db4c-2be5-4a26-8e98-c4fb482b1cea-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 00:32:44 crc kubenswrapper[4824]: I0224 00:32:44.652188 4824 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fc5db4c-2be5-4a26-8e98-c4fb482b1cea-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 00:32:44 crc kubenswrapper[4824]: I0224 00:32:44.652207 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kf4q9\" (UniqueName: \"kubernetes.io/projected/2fc5db4c-2be5-4a26-8e98-c4fb482b1cea-kube-api-access-kf4q9\") on node \"crc\" DevicePath \"\"" Feb 24 00:32:45 crc kubenswrapper[4824]: I0224 00:32:45.044625 4824 generic.go:334] "Generic (PLEG): container finished" podID="2fc5db4c-2be5-4a26-8e98-c4fb482b1cea" containerID="e6fdead5bb2143b4d1e75d494c8e4c428dc0fa6bb4c070ad94edff3cba2dd209" exitCode=0 Feb 24 00:32:45 crc kubenswrapper[4824]: I0224 00:32:45.044670 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lwznq" event={"ID":"2fc5db4c-2be5-4a26-8e98-c4fb482b1cea","Type":"ContainerDied","Data":"e6fdead5bb2143b4d1e75d494c8e4c428dc0fa6bb4c070ad94edff3cba2dd209"} Feb 24 00:32:45 crc kubenswrapper[4824]: I0224 00:32:45.044734 4824 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lwznq" Feb 24 00:32:45 crc kubenswrapper[4824]: I0224 00:32:45.044765 4824 scope.go:117] "RemoveContainer" containerID="e6fdead5bb2143b4d1e75d494c8e4c428dc0fa6bb4c070ad94edff3cba2dd209" Feb 24 00:32:45 crc kubenswrapper[4824]: I0224 00:32:45.044748 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lwznq" event={"ID":"2fc5db4c-2be5-4a26-8e98-c4fb482b1cea","Type":"ContainerDied","Data":"a57b0ccc6fb7339c031ffe79a75a2452b49fe5975dbc37e0e6d1e4a1c43a1a3c"} Feb 24 00:32:45 crc kubenswrapper[4824]: I0224 00:32:45.069949 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lwznq"] Feb 24 00:32:45 crc kubenswrapper[4824]: I0224 00:32:45.073906 4824 scope.go:117] "RemoveContainer" containerID="1105fc0199081feabbc77e0aa8649e4fc6931a85dd0bd0a288d174bf9d62190b" Feb 24 00:32:45 crc kubenswrapper[4824]: I0224 00:32:45.078592 4824 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lwznq"] Feb 24 00:32:45 crc kubenswrapper[4824]: I0224 00:32:45.099258 4824 scope.go:117] "RemoveContainer" containerID="f1791b119fda8a76356ec44d4cb941205460e154a17e6bf25478d2d193834b3a" Feb 24 00:32:45 crc kubenswrapper[4824]: I0224 00:32:45.130270 4824 scope.go:117] "RemoveContainer" containerID="e6fdead5bb2143b4d1e75d494c8e4c428dc0fa6bb4c070ad94edff3cba2dd209" Feb 24 00:32:45 crc kubenswrapper[4824]: E0224 00:32:45.132112 4824 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6fdead5bb2143b4d1e75d494c8e4c428dc0fa6bb4c070ad94edff3cba2dd209\": container with ID starting with e6fdead5bb2143b4d1e75d494c8e4c428dc0fa6bb4c070ad94edff3cba2dd209 not found: ID does not exist" containerID="e6fdead5bb2143b4d1e75d494c8e4c428dc0fa6bb4c070ad94edff3cba2dd209" Feb 24 00:32:45 crc kubenswrapper[4824]: I0224 00:32:45.132165 4824 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6fdead5bb2143b4d1e75d494c8e4c428dc0fa6bb4c070ad94edff3cba2dd209"} err="failed to get container status \"e6fdead5bb2143b4d1e75d494c8e4c428dc0fa6bb4c070ad94edff3cba2dd209\": rpc error: code = NotFound desc = could not find container \"e6fdead5bb2143b4d1e75d494c8e4c428dc0fa6bb4c070ad94edff3cba2dd209\": container with ID starting with e6fdead5bb2143b4d1e75d494c8e4c428dc0fa6bb4c070ad94edff3cba2dd209 not found: ID does not exist" Feb 24 00:32:45 crc kubenswrapper[4824]: I0224 00:32:45.132204 4824 scope.go:117] "RemoveContainer" containerID="1105fc0199081feabbc77e0aa8649e4fc6931a85dd0bd0a288d174bf9d62190b" Feb 24 00:32:45 crc kubenswrapper[4824]: E0224 00:32:45.132888 4824 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1105fc0199081feabbc77e0aa8649e4fc6931a85dd0bd0a288d174bf9d62190b\": container with ID starting with 1105fc0199081feabbc77e0aa8649e4fc6931a85dd0bd0a288d174bf9d62190b not found: ID does not exist" containerID="1105fc0199081feabbc77e0aa8649e4fc6931a85dd0bd0a288d174bf9d62190b" Feb 24 00:32:45 crc kubenswrapper[4824]: I0224 00:32:45.132947 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1105fc0199081feabbc77e0aa8649e4fc6931a85dd0bd0a288d174bf9d62190b"} err="failed to get container status \"1105fc0199081feabbc77e0aa8649e4fc6931a85dd0bd0a288d174bf9d62190b\": rpc error: code = NotFound desc = could not find container \"1105fc0199081feabbc77e0aa8649e4fc6931a85dd0bd0a288d174bf9d62190b\": container with ID starting with 1105fc0199081feabbc77e0aa8649e4fc6931a85dd0bd0a288d174bf9d62190b not found: ID does not exist" Feb 24 00:32:45 crc kubenswrapper[4824]: I0224 00:32:45.132994 4824 scope.go:117] "RemoveContainer" containerID="f1791b119fda8a76356ec44d4cb941205460e154a17e6bf25478d2d193834b3a" Feb 24 00:32:45 crc kubenswrapper[4824]: E0224 
00:32:45.133493 4824 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1791b119fda8a76356ec44d4cb941205460e154a17e6bf25478d2d193834b3a\": container with ID starting with f1791b119fda8a76356ec44d4cb941205460e154a17e6bf25478d2d193834b3a not found: ID does not exist" containerID="f1791b119fda8a76356ec44d4cb941205460e154a17e6bf25478d2d193834b3a" Feb 24 00:32:45 crc kubenswrapper[4824]: I0224 00:32:45.133544 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1791b119fda8a76356ec44d4cb941205460e154a17e6bf25478d2d193834b3a"} err="failed to get container status \"f1791b119fda8a76356ec44d4cb941205460e154a17e6bf25478d2d193834b3a\": rpc error: code = NotFound desc = could not find container \"f1791b119fda8a76356ec44d4cb941205460e154a17e6bf25478d2d193834b3a\": container with ID starting with f1791b119fda8a76356ec44d4cb941205460e154a17e6bf25478d2d193834b3a not found: ID does not exist" Feb 24 00:32:46 crc kubenswrapper[4824]: I0224 00:32:46.702713 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fc5db4c-2be5-4a26-8e98-c4fb482b1cea" path="/var/lib/kubelet/pods/2fc5db4c-2be5-4a26-8e98-c4fb482b1cea/volumes" Feb 24 00:32:47 crc kubenswrapper[4824]: I0224 00:32:47.066997 4824 generic.go:334] "Generic (PLEG): container finished" podID="3b209249-9fc7-4266-9089-9a228d1be14a" containerID="e39c45acb66eaf25d0796fbc6e40a1a5197765b7c4699af74ae2a62987aca171" exitCode=0 Feb 24 00:32:47 crc kubenswrapper[4824]: I0224 00:32:47.067065 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-bgdgk" event={"ID":"3b209249-9fc7-4266-9089-9a228d1be14a","Type":"ContainerDied","Data":"e39c45acb66eaf25d0796fbc6e40a1a5197765b7c4699af74ae2a62987aca171"} Feb 24 00:32:48 crc kubenswrapper[4824]: I0224 00:32:48.339101 4824 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-bgdgk" Feb 24 00:32:48 crc kubenswrapper[4824]: I0224 00:32:48.421430 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/3b209249-9fc7-4266-9089-9a228d1be14a-ceilometer-entrypoint-script\") pod \"3b209249-9fc7-4266-9089-9a228d1be14a\" (UID: \"3b209249-9fc7-4266-9089-9a228d1be14a\") " Feb 24 00:32:48 crc kubenswrapper[4824]: I0224 00:32:48.421544 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4kgg\" (UniqueName: \"kubernetes.io/projected/3b209249-9fc7-4266-9089-9a228d1be14a-kube-api-access-s4kgg\") pod \"3b209249-9fc7-4266-9089-9a228d1be14a\" (UID: \"3b209249-9fc7-4266-9089-9a228d1be14a\") " Feb 24 00:32:48 crc kubenswrapper[4824]: I0224 00:32:48.421647 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/3b209249-9fc7-4266-9089-9a228d1be14a-collectd-config\") pod \"3b209249-9fc7-4266-9089-9a228d1be14a\" (UID: \"3b209249-9fc7-4266-9089-9a228d1be14a\") " Feb 24 00:32:48 crc kubenswrapper[4824]: I0224 00:32:48.421792 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/3b209249-9fc7-4266-9089-9a228d1be14a-ceilometer-publisher\") pod \"3b209249-9fc7-4266-9089-9a228d1be14a\" (UID: \"3b209249-9fc7-4266-9089-9a228d1be14a\") " Feb 24 00:32:48 crc kubenswrapper[4824]: I0224 00:32:48.421814 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/3b209249-9fc7-4266-9089-9a228d1be14a-collectd-entrypoint-script\") pod \"3b209249-9fc7-4266-9089-9a228d1be14a\" (UID: \"3b209249-9fc7-4266-9089-9a228d1be14a\") " Feb 24 00:32:48 crc kubenswrapper[4824]: I0224 00:32:48.421873 4824 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/3b209249-9fc7-4266-9089-9a228d1be14a-healthcheck-log\") pod \"3b209249-9fc7-4266-9089-9a228d1be14a\" (UID: \"3b209249-9fc7-4266-9089-9a228d1be14a\") " Feb 24 00:32:48 crc kubenswrapper[4824]: I0224 00:32:48.421897 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/3b209249-9fc7-4266-9089-9a228d1be14a-sensubility-config\") pod \"3b209249-9fc7-4266-9089-9a228d1be14a\" (UID: \"3b209249-9fc7-4266-9089-9a228d1be14a\") " Feb 24 00:32:48 crc kubenswrapper[4824]: I0224 00:32:48.430109 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b209249-9fc7-4266-9089-9a228d1be14a-kube-api-access-s4kgg" (OuterVolumeSpecName: "kube-api-access-s4kgg") pod "3b209249-9fc7-4266-9089-9a228d1be14a" (UID: "3b209249-9fc7-4266-9089-9a228d1be14a"). InnerVolumeSpecName "kube-api-access-s4kgg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:32:48 crc kubenswrapper[4824]: I0224 00:32:48.444071 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b209249-9fc7-4266-9089-9a228d1be14a-sensubility-config" (OuterVolumeSpecName: "sensubility-config") pod "3b209249-9fc7-4266-9089-9a228d1be14a" (UID: "3b209249-9fc7-4266-9089-9a228d1be14a"). InnerVolumeSpecName "sensubility-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:32:48 crc kubenswrapper[4824]: I0224 00:32:48.444094 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b209249-9fc7-4266-9089-9a228d1be14a-ceilometer-publisher" (OuterVolumeSpecName: "ceilometer-publisher") pod "3b209249-9fc7-4266-9089-9a228d1be14a" (UID: "3b209249-9fc7-4266-9089-9a228d1be14a"). InnerVolumeSpecName "ceilometer-publisher". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:32:48 crc kubenswrapper[4824]: I0224 00:32:48.450848 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b209249-9fc7-4266-9089-9a228d1be14a-healthcheck-log" (OuterVolumeSpecName: "healthcheck-log") pod "3b209249-9fc7-4266-9089-9a228d1be14a" (UID: "3b209249-9fc7-4266-9089-9a228d1be14a"). InnerVolumeSpecName "healthcheck-log". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:32:48 crc kubenswrapper[4824]: I0224 00:32:48.454155 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b209249-9fc7-4266-9089-9a228d1be14a-collectd-config" (OuterVolumeSpecName: "collectd-config") pod "3b209249-9fc7-4266-9089-9a228d1be14a" (UID: "3b209249-9fc7-4266-9089-9a228d1be14a"). InnerVolumeSpecName "collectd-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:32:48 crc kubenswrapper[4824]: E0224 00:32:48.473551 4824 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3b209249-9fc7-4266-9089-9a228d1be14a-ceilometer-entrypoint-script podName:3b209249-9fc7-4266-9089-9a228d1be14a nodeName:}" failed. No retries permitted until 2026-02-24 00:32:48.973469401 +0000 UTC m=+1632.963093870 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "ceilometer-entrypoint-script" (UniqueName: "kubernetes.io/configmap/3b209249-9fc7-4266-9089-9a228d1be14a-ceilometer-entrypoint-script") pod "3b209249-9fc7-4266-9089-9a228d1be14a" (UID: "3b209249-9fc7-4266-9089-9a228d1be14a") : error deleting /var/lib/kubelet/pods/3b209249-9fc7-4266-9089-9a228d1be14a/volume-subpaths: remove /var/lib/kubelet/pods/3b209249-9fc7-4266-9089-9a228d1be14a/volume-subpaths: no such file or directory Feb 24 00:32:48 crc kubenswrapper[4824]: I0224 00:32:48.473804 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b209249-9fc7-4266-9089-9a228d1be14a-collectd-entrypoint-script" (OuterVolumeSpecName: "collectd-entrypoint-script") pod "3b209249-9fc7-4266-9089-9a228d1be14a" (UID: "3b209249-9fc7-4266-9089-9a228d1be14a"). InnerVolumeSpecName "collectd-entrypoint-script". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:32:48 crc kubenswrapper[4824]: I0224 00:32:48.524935 4824 reconciler_common.go:293] "Volume detached for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/3b209249-9fc7-4266-9089-9a228d1be14a-healthcheck-log\") on node \"crc\" DevicePath \"\"" Feb 24 00:32:48 crc kubenswrapper[4824]: I0224 00:32:48.524982 4824 reconciler_common.go:293] "Volume detached for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/3b209249-9fc7-4266-9089-9a228d1be14a-sensubility-config\") on node \"crc\" DevicePath \"\"" Feb 24 00:32:48 crc kubenswrapper[4824]: I0224 00:32:48.524993 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4kgg\" (UniqueName: \"kubernetes.io/projected/3b209249-9fc7-4266-9089-9a228d1be14a-kube-api-access-s4kgg\") on node \"crc\" DevicePath \"\"" Feb 24 00:32:48 crc kubenswrapper[4824]: I0224 00:32:48.525004 4824 reconciler_common.go:293] "Volume detached for volume \"collectd-config\" (UniqueName: 
\"kubernetes.io/configmap/3b209249-9fc7-4266-9089-9a228d1be14a-collectd-config\") on node \"crc\" DevicePath \"\"" Feb 24 00:32:48 crc kubenswrapper[4824]: I0224 00:32:48.525013 4824 reconciler_common.go:293] "Volume detached for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/3b209249-9fc7-4266-9089-9a228d1be14a-ceilometer-publisher\") on node \"crc\" DevicePath \"\"" Feb 24 00:32:48 crc kubenswrapper[4824]: I0224 00:32:48.525022 4824 reconciler_common.go:293] "Volume detached for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/3b209249-9fc7-4266-9089-9a228d1be14a-collectd-entrypoint-script\") on node \"crc\" DevicePath \"\"" Feb 24 00:32:49 crc kubenswrapper[4824]: I0224 00:32:49.033262 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/3b209249-9fc7-4266-9089-9a228d1be14a-ceilometer-entrypoint-script\") pod \"3b209249-9fc7-4266-9089-9a228d1be14a\" (UID: \"3b209249-9fc7-4266-9089-9a228d1be14a\") " Feb 24 00:32:49 crc kubenswrapper[4824]: I0224 00:32:49.033782 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b209249-9fc7-4266-9089-9a228d1be14a-ceilometer-entrypoint-script" (OuterVolumeSpecName: "ceilometer-entrypoint-script") pod "3b209249-9fc7-4266-9089-9a228d1be14a" (UID: "3b209249-9fc7-4266-9089-9a228d1be14a"). InnerVolumeSpecName "ceilometer-entrypoint-script". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 00:32:49 crc kubenswrapper[4824]: I0224 00:32:49.034209 4824 reconciler_common.go:293] "Volume detached for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/3b209249-9fc7-4266-9089-9a228d1be14a-ceilometer-entrypoint-script\") on node \"crc\" DevicePath \"\"" Feb 24 00:32:49 crc kubenswrapper[4824]: I0224 00:32:49.086778 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-bgdgk" event={"ID":"3b209249-9fc7-4266-9089-9a228d1be14a","Type":"ContainerDied","Data":"508749b5e1ef331b99876faa1b1b095b9f3d7ad6af9e6509ba2456478ab33bef"} Feb 24 00:32:49 crc kubenswrapper[4824]: I0224 00:32:49.086841 4824 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="508749b5e1ef331b99876faa1b1b095b9f3d7ad6af9e6509ba2456478ab33bef" Feb 24 00:32:49 crc kubenswrapper[4824]: I0224 00:32:49.087330 4824 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-bgdgk" Feb 24 00:32:50 crc kubenswrapper[4824]: I0224 00:32:50.313030 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_stf-smoketest-smoke1-bgdgk_3b209249-9fc7-4266-9089-9a228d1be14a/smoketest-collectd/0.log" Feb 24 00:32:50 crc kubenswrapper[4824]: I0224 00:32:50.537908 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_stf-smoketest-smoke1-bgdgk_3b209249-9fc7-4266-9089-9a228d1be14a/smoketest-ceilometer/0.log" Feb 24 00:32:50 crc kubenswrapper[4824]: I0224 00:32:50.820394 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-interconnect-68864d46cb-76s9x_58055cab-656a-46f2-a3e6-ab76d8943362/default-interconnect/0.log" Feb 24 00:32:51 crc kubenswrapper[4824]: I0224 00:32:51.074730 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-meter-smartgateway-7cd87f9766-68kfw_b1b6fe19-ad2f-490e-80dc-39ed80de85b3/bridge/2.log" Feb 24 00:32:51 crc kubenswrapper[4824]: I0224 00:32:51.300793 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-meter-smartgateway-7cd87f9766-68kfw_b1b6fe19-ad2f-490e-80dc-39ed80de85b3/sg-core/0.log" Feb 24 00:32:51 crc kubenswrapper[4824]: I0224 00:32:51.524635 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-event-smartgateway-7bcfc4c8bf-2d6bt_77c39bbc-adcc-40f9-afe2-9d97f93262b9/bridge/2.log" Feb 24 00:32:51 crc kubenswrapper[4824]: I0224 00:32:51.784254 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-event-smartgateway-7bcfc4c8bf-2d6bt_77c39bbc-adcc-40f9-afe2-9d97f93262b9/sg-core/0.log" Feb 24 00:32:52 crc kubenswrapper[4824]: I0224 00:32:52.060258 4824 log.go:25] "Finished parsing log file" 
path="/var/log/pods/service-telemetry_default-cloud1-ceil-meter-smartgateway-57948895dc-kmkbw_99b264a5-5103-4445-8978-942c71208377/bridge/2.log" Feb 24 00:32:52 crc kubenswrapper[4824]: I0224 00:32:52.322445 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-meter-smartgateway-57948895dc-kmkbw_99b264a5-5103-4445-8978-942c71208377/sg-core/0.log" Feb 24 00:32:52 crc kubenswrapper[4824]: I0224 00:32:52.553023 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-event-smartgateway-6cd878c86f-p97zc_19cb1d3e-5363-406a-a5f4-ecfe04edd347/bridge/2.log" Feb 24 00:32:52 crc kubenswrapper[4824]: I0224 00:32:52.781007 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-event-smartgateway-6cd878c86f-p97zc_19cb1d3e-5363-406a-a5f4-ecfe04edd347/sg-core/0.log" Feb 24 00:32:52 crc kubenswrapper[4824]: I0224 00:32:52.997896 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-sens-meter-smartgateway-5759b4d97-5jd84_25d3b43f-0bff-44ca-83f4-b8a0052cd764/bridge/2.log" Feb 24 00:32:53 crc kubenswrapper[4824]: I0224 00:32:53.296156 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-sens-meter-smartgateway-5759b4d97-5jd84_25d3b43f-0bff-44ca-83f4-b8a0052cd764/sg-core/0.log" Feb 24 00:32:57 crc kubenswrapper[4824]: I0224 00:32:57.140196 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-755b8777c-j59cx_99d102db-b6a5-428f-acec-1311a225325d/operator/0.log" Feb 24 00:32:57 crc kubenswrapper[4824]: I0224 00:32:57.386452 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-default-0_d1d48ccf-0bde-4748-8128-1e82ca1f302a/prometheus/0.log" Feb 24 00:32:57 crc kubenswrapper[4824]: I0224 00:32:57.631499 4824 log.go:25] "Finished parsing log file" 
path="/var/log/pods/service-telemetry_elasticsearch-es-default-0_96f9c835-f7c9-4774-9b95-8911ab4ffb23/elasticsearch/0.log" Feb 24 00:32:57 crc kubenswrapper[4824]: I0224 00:32:57.694213 4824 scope.go:117] "RemoveContainer" containerID="1fadecb8108bb82124c951985733cccd456bd51e1965c7fbca274984d7901364" Feb 24 00:32:57 crc kubenswrapper[4824]: E0224 00:32:57.694547 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vcbgn_openshift-machine-config-operator(939ca085-9383-42e6-b7d6-37f101137273)\"" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" Feb 24 00:32:57 crc kubenswrapper[4824]: I0224 00:32:57.887894 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-6856cfb745-zxg6n_13d35d6f-04c4-438a-bda9-ce9c4ed84b99/prometheus-webhook-snmp/0.log" Feb 24 00:32:58 crc kubenswrapper[4824]: I0224 00:32:58.172587 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_alertmanager-default-0_b4916ffb-2e83-480a-a12f-ad04c6144517/alertmanager/0.log" Feb 24 00:33:12 crc kubenswrapper[4824]: I0224 00:33:12.186832 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-7f7c584b79-2rbxz_3394aaea-7658-498b-aab1-7494fb832c8f/operator/0.log" Feb 24 00:33:12 crc kubenswrapper[4824]: I0224 00:33:12.693488 4824 scope.go:117] "RemoveContainer" containerID="1fadecb8108bb82124c951985733cccd456bd51e1965c7fbca274984d7901364" Feb 24 00:33:12 crc kubenswrapper[4824]: E0224 00:33:12.693913 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-vcbgn_openshift-machine-config-operator(939ca085-9383-42e6-b7d6-37f101137273)\"" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" Feb 24 00:33:15 crc kubenswrapper[4824]: I0224 00:33:15.853593 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-755b8777c-j59cx_99d102db-b6a5-428f-acec-1311a225325d/operator/0.log" Feb 24 00:33:16 crc kubenswrapper[4824]: I0224 00:33:16.112253 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_qdr-test_ac546686-8945-46b8-8577-da344c7517bd/qdr/0.log" Feb 24 00:33:24 crc kubenswrapper[4824]: I0224 00:33:24.694386 4824 scope.go:117] "RemoveContainer" containerID="1fadecb8108bb82124c951985733cccd456bd51e1965c7fbca274984d7901364" Feb 24 00:33:24 crc kubenswrapper[4824]: E0224 00:33:24.695956 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vcbgn_openshift-machine-config-operator(939ca085-9383-42e6-b7d6-37f101137273)\"" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" Feb 24 00:33:35 crc kubenswrapper[4824]: I0224 00:33:35.694185 4824 scope.go:117] "RemoveContainer" containerID="1fadecb8108bb82124c951985733cccd456bd51e1965c7fbca274984d7901364" Feb 24 00:33:35 crc kubenswrapper[4824]: E0224 00:33:35.695920 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vcbgn_openshift-machine-config-operator(939ca085-9383-42e6-b7d6-37f101137273)\"" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" 
podUID="939ca085-9383-42e6-b7d6-37f101137273" Feb 24 00:33:46 crc kubenswrapper[4824]: I0224 00:33:46.703044 4824 scope.go:117] "RemoveContainer" containerID="1fadecb8108bb82124c951985733cccd456bd51e1965c7fbca274984d7901364" Feb 24 00:33:46 crc kubenswrapper[4824]: E0224 00:33:46.703986 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vcbgn_openshift-machine-config-operator(939ca085-9383-42e6-b7d6-37f101137273)\"" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" Feb 24 00:33:49 crc kubenswrapper[4824]: I0224 00:33:49.452849 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-v4984/must-gather-t5kpz"] Feb 24 00:33:49 crc kubenswrapper[4824]: E0224 00:33:49.453911 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b209249-9fc7-4266-9089-9a228d1be14a" containerName="smoketest-collectd" Feb 24 00:33:49 crc kubenswrapper[4824]: I0224 00:33:49.453929 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b209249-9fc7-4266-9089-9a228d1be14a" containerName="smoketest-collectd" Feb 24 00:33:49 crc kubenswrapper[4824]: E0224 00:33:49.453950 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fc5db4c-2be5-4a26-8e98-c4fb482b1cea" containerName="registry-server" Feb 24 00:33:49 crc kubenswrapper[4824]: I0224 00:33:49.453959 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fc5db4c-2be5-4a26-8e98-c4fb482b1cea" containerName="registry-server" Feb 24 00:33:49 crc kubenswrapper[4824]: E0224 00:33:49.453975 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fc5db4c-2be5-4a26-8e98-c4fb482b1cea" containerName="extract-utilities" Feb 24 00:33:49 crc kubenswrapper[4824]: I0224 00:33:49.453984 4824 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="2fc5db4c-2be5-4a26-8e98-c4fb482b1cea" containerName="extract-utilities" Feb 24 00:33:49 crc kubenswrapper[4824]: E0224 00:33:49.454010 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fc5db4c-2be5-4a26-8e98-c4fb482b1cea" containerName="extract-content" Feb 24 00:33:49 crc kubenswrapper[4824]: I0224 00:33:49.454018 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fc5db4c-2be5-4a26-8e98-c4fb482b1cea" containerName="extract-content" Feb 24 00:33:49 crc kubenswrapper[4824]: E0224 00:33:49.454030 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b209249-9fc7-4266-9089-9a228d1be14a" containerName="smoketest-ceilometer" Feb 24 00:33:49 crc kubenswrapper[4824]: I0224 00:33:49.454038 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b209249-9fc7-4266-9089-9a228d1be14a" containerName="smoketest-ceilometer" Feb 24 00:33:49 crc kubenswrapper[4824]: I0224 00:33:49.454211 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b209249-9fc7-4266-9089-9a228d1be14a" containerName="smoketest-ceilometer" Feb 24 00:33:49 crc kubenswrapper[4824]: I0224 00:33:49.454223 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fc5db4c-2be5-4a26-8e98-c4fb482b1cea" containerName="registry-server" Feb 24 00:33:49 crc kubenswrapper[4824]: I0224 00:33:49.454235 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b209249-9fc7-4266-9089-9a228d1be14a" containerName="smoketest-collectd" Feb 24 00:33:49 crc kubenswrapper[4824]: I0224 00:33:49.455236 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-v4984/must-gather-t5kpz" Feb 24 00:33:49 crc kubenswrapper[4824]: I0224 00:33:49.461037 4824 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-v4984"/"default-dockercfg-mp758" Feb 24 00:33:49 crc kubenswrapper[4824]: I0224 00:33:49.461395 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-v4984"/"openshift-service-ca.crt" Feb 24 00:33:49 crc kubenswrapper[4824]: I0224 00:33:49.466957 4824 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-v4984"/"kube-root-ca.crt" Feb 24 00:33:49 crc kubenswrapper[4824]: I0224 00:33:49.482342 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-v4984/must-gather-t5kpz"] Feb 24 00:33:49 crc kubenswrapper[4824]: I0224 00:33:49.516958 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w677s\" (UniqueName: \"kubernetes.io/projected/30551e99-c0dc-480e-8aa4-cfb7df233fa5-kube-api-access-w677s\") pod \"must-gather-t5kpz\" (UID: \"30551e99-c0dc-480e-8aa4-cfb7df233fa5\") " pod="openshift-must-gather-v4984/must-gather-t5kpz" Feb 24 00:33:49 crc kubenswrapper[4824]: I0224 00:33:49.517010 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/30551e99-c0dc-480e-8aa4-cfb7df233fa5-must-gather-output\") pod \"must-gather-t5kpz\" (UID: \"30551e99-c0dc-480e-8aa4-cfb7df233fa5\") " pod="openshift-must-gather-v4984/must-gather-t5kpz" Feb 24 00:33:49 crc kubenswrapper[4824]: I0224 00:33:49.618869 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w677s\" (UniqueName: \"kubernetes.io/projected/30551e99-c0dc-480e-8aa4-cfb7df233fa5-kube-api-access-w677s\") pod \"must-gather-t5kpz\" (UID: \"30551e99-c0dc-480e-8aa4-cfb7df233fa5\") " 
pod="openshift-must-gather-v4984/must-gather-t5kpz" Feb 24 00:33:49 crc kubenswrapper[4824]: I0224 00:33:49.618930 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/30551e99-c0dc-480e-8aa4-cfb7df233fa5-must-gather-output\") pod \"must-gather-t5kpz\" (UID: \"30551e99-c0dc-480e-8aa4-cfb7df233fa5\") " pod="openshift-must-gather-v4984/must-gather-t5kpz" Feb 24 00:33:49 crc kubenswrapper[4824]: I0224 00:33:49.619565 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/30551e99-c0dc-480e-8aa4-cfb7df233fa5-must-gather-output\") pod \"must-gather-t5kpz\" (UID: \"30551e99-c0dc-480e-8aa4-cfb7df233fa5\") " pod="openshift-must-gather-v4984/must-gather-t5kpz" Feb 24 00:33:49 crc kubenswrapper[4824]: I0224 00:33:49.649423 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w677s\" (UniqueName: \"kubernetes.io/projected/30551e99-c0dc-480e-8aa4-cfb7df233fa5-kube-api-access-w677s\") pod \"must-gather-t5kpz\" (UID: \"30551e99-c0dc-480e-8aa4-cfb7df233fa5\") " pod="openshift-must-gather-v4984/must-gather-t5kpz" Feb 24 00:33:49 crc kubenswrapper[4824]: I0224 00:33:49.790377 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-v4984/must-gather-t5kpz" Feb 24 00:33:50 crc kubenswrapper[4824]: I0224 00:33:50.075598 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-v4984/must-gather-t5kpz"] Feb 24 00:33:50 crc kubenswrapper[4824]: I0224 00:33:50.582608 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v4984/must-gather-t5kpz" event={"ID":"30551e99-c0dc-480e-8aa4-cfb7df233fa5","Type":"ContainerStarted","Data":"6b1ff4420b968d45931a50a401488cf9d06db869860ffded055aa82b666bf6cf"} Feb 24 00:33:58 crc kubenswrapper[4824]: I0224 00:33:58.695088 4824 scope.go:117] "RemoveContainer" containerID="1fadecb8108bb82124c951985733cccd456bd51e1965c7fbca274984d7901364" Feb 24 00:33:58 crc kubenswrapper[4824]: E0224 00:33:58.696426 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vcbgn_openshift-machine-config-operator(939ca085-9383-42e6-b7d6-37f101137273)\"" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" Feb 24 00:33:59 crc kubenswrapper[4824]: I0224 00:33:59.672572 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v4984/must-gather-t5kpz" event={"ID":"30551e99-c0dc-480e-8aa4-cfb7df233fa5","Type":"ContainerStarted","Data":"c82f94371919f040f11f9a25c3e88f21683ea931dd570e2a2fd2964f6a6b29a8"} Feb 24 00:33:59 crc kubenswrapper[4824]: I0224 00:33:59.673016 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v4984/must-gather-t5kpz" event={"ID":"30551e99-c0dc-480e-8aa4-cfb7df233fa5","Type":"ContainerStarted","Data":"be48c11ddee87f6376cb573bf02565d2ce45f4f7484d835a87096a80c130fad9"} Feb 24 00:33:59 crc kubenswrapper[4824]: I0224 00:33:59.698196 4824 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-must-gather-v4984/must-gather-t5kpz" podStartSLOduration=2.259803591 podStartE2EDuration="10.698171256s" podCreationTimestamp="2026-02-24 00:33:49 +0000 UTC" firstStartedPulling="2026-02-24 00:33:50.08676884 +0000 UTC m=+1694.076393309" lastFinishedPulling="2026-02-24 00:33:58.525136505 +0000 UTC m=+1702.514760974" observedRunningTime="2026-02-24 00:33:59.69357131 +0000 UTC m=+1703.683195779" watchObservedRunningTime="2026-02-24 00:33:59.698171256 +0000 UTC m=+1703.687795725" Feb 24 00:34:12 crc kubenswrapper[4824]: I0224 00:34:12.693430 4824 scope.go:117] "RemoveContainer" containerID="1fadecb8108bb82124c951985733cccd456bd51e1965c7fbca274984d7901364" Feb 24 00:34:12 crc kubenswrapper[4824]: E0224 00:34:12.694422 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vcbgn_openshift-machine-config-operator(939ca085-9383-42e6-b7d6-37f101137273)\"" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" Feb 24 00:34:27 crc kubenswrapper[4824]: I0224 00:34:27.694320 4824 scope.go:117] "RemoveContainer" containerID="1fadecb8108bb82124c951985733cccd456bd51e1965c7fbca274984d7901364" Feb 24 00:34:27 crc kubenswrapper[4824]: E0224 00:34:27.695540 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vcbgn_openshift-machine-config-operator(939ca085-9383-42e6-b7d6-37f101137273)\"" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" Feb 24 00:34:39 crc kubenswrapper[4824]: I0224 00:34:39.694885 4824 scope.go:117] "RemoveContainer" 
containerID="1fadecb8108bb82124c951985733cccd456bd51e1965c7fbca274984d7901364" Feb 24 00:34:39 crc kubenswrapper[4824]: E0224 00:34:39.696003 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vcbgn_openshift-machine-config-operator(939ca085-9383-42e6-b7d6-37f101137273)\"" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" Feb 24 00:34:44 crc kubenswrapper[4824]: I0224 00:34:44.972796 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-q8hvw_13bff804-f118-473b-a547-433aed671b46/control-plane-machine-set-operator/0.log" Feb 24 00:34:45 crc kubenswrapper[4824]: I0224 00:34:45.178115 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-kh6hg_53344821-2f26-459a-9e42-003f3f1b5a87/kube-rbac-proxy/0.log" Feb 24 00:34:45 crc kubenswrapper[4824]: I0224 00:34:45.188795 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-kh6hg_53344821-2f26-459a-9e42-003f3f1b5a87/machine-api-operator/0.log" Feb 24 00:34:54 crc kubenswrapper[4824]: I0224 00:34:54.694234 4824 scope.go:117] "RemoveContainer" containerID="1fadecb8108bb82124c951985733cccd456bd51e1965c7fbca274984d7901364" Feb 24 00:34:54 crc kubenswrapper[4824]: E0224 00:34:54.695346 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vcbgn_openshift-machine-config-operator(939ca085-9383-42e6-b7d6-37f101137273)\"" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" 
podUID="939ca085-9383-42e6-b7d6-37f101137273" Feb 24 00:34:58 crc kubenswrapper[4824]: I0224 00:34:58.377578 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-9mpql_1f370348-c40e-4096-98c1-d681f34b8659/cert-manager-controller/0.log" Feb 24 00:34:58 crc kubenswrapper[4824]: I0224 00:34:58.547745 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-qhzcr_219daf0d-f400-4a2c-8374-5c23e10c27a6/cert-manager-cainjector/0.log" Feb 24 00:34:58 crc kubenswrapper[4824]: I0224 00:34:58.557403 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-m8rqb_ad293038-bf1d-4800-bd32-9488c5f19e95/cert-manager-webhook/0.log" Feb 24 00:35:05 crc kubenswrapper[4824]: I0224 00:35:05.694445 4824 scope.go:117] "RemoveContainer" containerID="1fadecb8108bb82124c951985733cccd456bd51e1965c7fbca274984d7901364" Feb 24 00:35:05 crc kubenswrapper[4824]: E0224 00:35:05.695665 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vcbgn_openshift-machine-config-operator(939ca085-9383-42e6-b7d6-37f101137273)\"" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" Feb 24 00:35:14 crc kubenswrapper[4824]: I0224 00:35:14.783784 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-df47j_02a08fee-e933-4730-8755-7419c78d6525/prometheus-operator/0.log" Feb 24 00:35:14 crc kubenswrapper[4824]: I0224 00:35:14.918432 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-579b5ff79b-2qg2s_7c2534a7-fe82-47e6-b6f9-2bf928bf7c9a/prometheus-operator-admission-webhook/0.log" Feb 24 
00:35:14 crc kubenswrapper[4824]: I0224 00:35:14.977036 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-579b5ff79b-nwsdf_350461e1-7bfd-4095-9d74-4c3df3159694/prometheus-operator-admission-webhook/0.log" Feb 24 00:35:15 crc kubenswrapper[4824]: I0224 00:35:15.112566 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-hhf7q_823099c2-9764-455a-a682-57c154c0d895/operator/0.log" Feb 24 00:35:15 crc kubenswrapper[4824]: I0224 00:35:15.198624 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-frbxc_885263fe-5a06-4089-b662-d3e4dbc7d08e/perses-operator/0.log" Feb 24 00:35:19 crc kubenswrapper[4824]: I0224 00:35:19.693806 4824 scope.go:117] "RemoveContainer" containerID="1fadecb8108bb82124c951985733cccd456bd51e1965c7fbca274984d7901364" Feb 24 00:35:19 crc kubenswrapper[4824]: E0224 00:35:19.694460 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vcbgn_openshift-machine-config-operator(939ca085-9383-42e6-b7d6-37f101137273)\"" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" Feb 24 00:35:29 crc kubenswrapper[4824]: I0224 00:35:29.535247 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f17dtpw_55bd419c-9f16-434a-9a7f-0693ab6601d4/util/0.log" Feb 24 00:35:29 crc kubenswrapper[4824]: I0224 00:35:29.781029 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f17dtpw_55bd419c-9f16-434a-9a7f-0693ab6601d4/pull/0.log" Feb 24 00:35:29 crc kubenswrapper[4824]: 
I0224 00:35:29.800235 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f17dtpw_55bd419c-9f16-434a-9a7f-0693ab6601d4/pull/0.log" Feb 24 00:35:29 crc kubenswrapper[4824]: I0224 00:35:29.834888 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f17dtpw_55bd419c-9f16-434a-9a7f-0693ab6601d4/util/0.log" Feb 24 00:35:29 crc kubenswrapper[4824]: I0224 00:35:29.985009 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f17dtpw_55bd419c-9f16-434a-9a7f-0693ab6601d4/util/0.log" Feb 24 00:35:29 crc kubenswrapper[4824]: I0224 00:35:29.985642 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f17dtpw_55bd419c-9f16-434a-9a7f-0693ab6601d4/extract/0.log" Feb 24 00:35:29 crc kubenswrapper[4824]: I0224 00:35:29.993619 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f17dtpw_55bd419c-9f16-434a-9a7f-0693ab6601d4/pull/0.log" Feb 24 00:35:30 crc kubenswrapper[4824]: I0224 00:35:30.159365 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fnmwrf_7191d6cb-0051-4cd2-a93d-a26af6142eb8/util/0.log" Feb 24 00:35:30 crc kubenswrapper[4824]: I0224 00:35:30.387997 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fnmwrf_7191d6cb-0051-4cd2-a93d-a26af6142eb8/pull/0.log" Feb 24 00:35:30 crc kubenswrapper[4824]: I0224 00:35:30.388009 4824 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fnmwrf_7191d6cb-0051-4cd2-a93d-a26af6142eb8/pull/0.log" Feb 24 00:35:30 crc kubenswrapper[4824]: I0224 00:35:30.417444 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fnmwrf_7191d6cb-0051-4cd2-a93d-a26af6142eb8/util/0.log" Feb 24 00:35:30 crc kubenswrapper[4824]: I0224 00:35:30.552864 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fnmwrf_7191d6cb-0051-4cd2-a93d-a26af6142eb8/util/0.log" Feb 24 00:35:30 crc kubenswrapper[4824]: I0224 00:35:30.570042 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fnmwrf_7191d6cb-0051-4cd2-a93d-a26af6142eb8/pull/0.log" Feb 24 00:35:30 crc kubenswrapper[4824]: I0224 00:35:30.578870 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fnmwrf_7191d6cb-0051-4cd2-a93d-a26af6142eb8/extract/0.log" Feb 24 00:35:30 crc kubenswrapper[4824]: I0224 00:35:30.767871 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jcmlq_379ee973-5632-434f-953c-7f23d7dc8f9d/util/0.log" Feb 24 00:35:30 crc kubenswrapper[4824]: I0224 00:35:30.995228 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jcmlq_379ee973-5632-434f-953c-7f23d7dc8f9d/util/0.log" Feb 24 00:35:30 crc kubenswrapper[4824]: I0224 00:35:30.999308 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jcmlq_379ee973-5632-434f-953c-7f23d7dc8f9d/pull/0.log" Feb 24 
00:35:31 crc kubenswrapper[4824]: I0224 00:35:31.002007 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jcmlq_379ee973-5632-434f-953c-7f23d7dc8f9d/pull/0.log" Feb 24 00:35:31 crc kubenswrapper[4824]: I0224 00:35:31.170139 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jcmlq_379ee973-5632-434f-953c-7f23d7dc8f9d/util/0.log" Feb 24 00:35:31 crc kubenswrapper[4824]: I0224 00:35:31.227670 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jcmlq_379ee973-5632-434f-953c-7f23d7dc8f9d/pull/0.log" Feb 24 00:35:31 crc kubenswrapper[4824]: I0224 00:35:31.232761 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jcmlq_379ee973-5632-434f-953c-7f23d7dc8f9d/extract/0.log" Feb 24 00:35:31 crc kubenswrapper[4824]: I0224 00:35:31.369891 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t2v92_5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d/util/0.log" Feb 24 00:35:31 crc kubenswrapper[4824]: I0224 00:35:31.553392 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t2v92_5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d/util/0.log" Feb 24 00:35:31 crc kubenswrapper[4824]: I0224 00:35:31.603109 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t2v92_5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d/pull/0.log" Feb 24 00:35:31 crc kubenswrapper[4824]: I0224 00:35:31.608361 4824 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t2v92_5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d/pull/0.log" Feb 24 00:35:31 crc kubenswrapper[4824]: I0224 00:35:31.774015 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t2v92_5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d/pull/0.log" Feb 24 00:35:31 crc kubenswrapper[4824]: I0224 00:35:31.779470 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t2v92_5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d/util/0.log" Feb 24 00:35:31 crc kubenswrapper[4824]: I0224 00:35:31.788672 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08t2v92_5017dbd5-8ff7-4321-9ff3-db31cdbb5f8d/extract/0.log" Feb 24 00:35:31 crc kubenswrapper[4824]: I0224 00:35:31.967652 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9gq54_2d7ceac8-1cca-49dc-bff6-f6fa38cbfc1d/extract-utilities/0.log" Feb 24 00:35:32 crc kubenswrapper[4824]: I0224 00:35:32.127178 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9gq54_2d7ceac8-1cca-49dc-bff6-f6fa38cbfc1d/extract-content/0.log" Feb 24 00:35:32 crc kubenswrapper[4824]: I0224 00:35:32.132877 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9gq54_2d7ceac8-1cca-49dc-bff6-f6fa38cbfc1d/extract-utilities/0.log" Feb 24 00:35:32 crc kubenswrapper[4824]: I0224 00:35:32.137886 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9gq54_2d7ceac8-1cca-49dc-bff6-f6fa38cbfc1d/extract-content/0.log" Feb 24 00:35:32 crc kubenswrapper[4824]: I0224 00:35:32.325234 4824 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-9gq54_2d7ceac8-1cca-49dc-bff6-f6fa38cbfc1d/extract-utilities/0.log" Feb 24 00:35:32 crc kubenswrapper[4824]: I0224 00:35:32.356061 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9gq54_2d7ceac8-1cca-49dc-bff6-f6fa38cbfc1d/extract-content/0.log" Feb 24 00:35:32 crc kubenswrapper[4824]: I0224 00:35:32.574349 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9t5cw_9da3bd34-bc43-4c9d-a974-a131ad945913/extract-utilities/0.log" Feb 24 00:35:32 crc kubenswrapper[4824]: I0224 00:35:32.681032 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9gq54_2d7ceac8-1cca-49dc-bff6-f6fa38cbfc1d/registry-server/0.log" Feb 24 00:35:32 crc kubenswrapper[4824]: I0224 00:35:32.769010 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9t5cw_9da3bd34-bc43-4c9d-a974-a131ad945913/extract-content/0.log" Feb 24 00:35:32 crc kubenswrapper[4824]: I0224 00:35:32.789969 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9t5cw_9da3bd34-bc43-4c9d-a974-a131ad945913/extract-utilities/0.log" Feb 24 00:35:32 crc kubenswrapper[4824]: I0224 00:35:32.800819 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9t5cw_9da3bd34-bc43-4c9d-a974-a131ad945913/extract-content/0.log" Feb 24 00:35:32 crc kubenswrapper[4824]: I0224 00:35:32.983169 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9t5cw_9da3bd34-bc43-4c9d-a974-a131ad945913/extract-utilities/0.log" Feb 24 00:35:32 crc kubenswrapper[4824]: I0224 00:35:32.992750 4824 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-9t5cw_9da3bd34-bc43-4c9d-a974-a131ad945913/extract-content/0.log" Feb 24 00:35:33 crc kubenswrapper[4824]: I0224 00:35:33.231724 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-jqdmp_1c407e9b-e49e-46a5-8920-786aad1539fb/marketplace-operator/0.log" Feb 24 00:35:33 crc kubenswrapper[4824]: I0224 00:35:33.297441 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9t5cw_9da3bd34-bc43-4c9d-a974-a131ad945913/registry-server/0.log" Feb 24 00:35:33 crc kubenswrapper[4824]: I0224 00:35:33.368737 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2gh9t_ee751741-65c5-4db2-aa84-8c1e6868cf86/extract-utilities/0.log" Feb 24 00:35:33 crc kubenswrapper[4824]: I0224 00:35:33.475768 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2gh9t_ee751741-65c5-4db2-aa84-8c1e6868cf86/extract-utilities/0.log" Feb 24 00:35:33 crc kubenswrapper[4824]: I0224 00:35:33.513806 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2gh9t_ee751741-65c5-4db2-aa84-8c1e6868cf86/extract-content/0.log" Feb 24 00:35:33 crc kubenswrapper[4824]: I0224 00:35:33.536898 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2gh9t_ee751741-65c5-4db2-aa84-8c1e6868cf86/extract-content/0.log" Feb 24 00:35:33 crc kubenswrapper[4824]: I0224 00:35:33.654128 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2gh9t_ee751741-65c5-4db2-aa84-8c1e6868cf86/extract-utilities/0.log" Feb 24 00:35:33 crc kubenswrapper[4824]: I0224 00:35:33.676377 4824 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-2gh9t_ee751741-65c5-4db2-aa84-8c1e6868cf86/extract-content/0.log" Feb 24 00:35:33 crc kubenswrapper[4824]: I0224 00:35:33.693589 4824 scope.go:117] "RemoveContainer" containerID="1fadecb8108bb82124c951985733cccd456bd51e1965c7fbca274984d7901364" Feb 24 00:35:33 crc kubenswrapper[4824]: E0224 00:35:33.693835 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vcbgn_openshift-machine-config-operator(939ca085-9383-42e6-b7d6-37f101137273)\"" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" Feb 24 00:35:33 crc kubenswrapper[4824]: I0224 00:35:33.924933 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2gh9t_ee751741-65c5-4db2-aa84-8c1e6868cf86/registry-server/0.log" Feb 24 00:35:46 crc kubenswrapper[4824]: I0224 00:35:46.444372 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-579b5ff79b-2qg2s_7c2534a7-fe82-47e6-b6f9-2bf928bf7c9a/prometheus-operator-admission-webhook/0.log" Feb 24 00:35:46 crc kubenswrapper[4824]: I0224 00:35:46.459540 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-df47j_02a08fee-e933-4730-8755-7419c78d6525/prometheus-operator/0.log" Feb 24 00:35:46 crc kubenswrapper[4824]: I0224 00:35:46.461424 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-579b5ff79b-nwsdf_350461e1-7bfd-4095-9d74-4c3df3159694/prometheus-operator-admission-webhook/0.log" Feb 24 00:35:46 crc kubenswrapper[4824]: I0224 00:35:46.619035 4824 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-hhf7q_823099c2-9764-455a-a682-57c154c0d895/operator/0.log" Feb 24 00:35:46 crc kubenswrapper[4824]: I0224 00:35:46.632648 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-frbxc_885263fe-5a06-4089-b662-d3e4dbc7d08e/perses-operator/0.log" Feb 24 00:35:48 crc kubenswrapper[4824]: I0224 00:35:48.694018 4824 scope.go:117] "RemoveContainer" containerID="1fadecb8108bb82124c951985733cccd456bd51e1965c7fbca274984d7901364" Feb 24 00:35:48 crc kubenswrapper[4824]: E0224 00:35:48.694487 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vcbgn_openshift-machine-config-operator(939ca085-9383-42e6-b7d6-37f101137273)\"" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" Feb 24 00:36:02 crc kubenswrapper[4824]: I0224 00:36:02.694907 4824 scope.go:117] "RemoveContainer" containerID="1fadecb8108bb82124c951985733cccd456bd51e1965c7fbca274984d7901364" Feb 24 00:36:02 crc kubenswrapper[4824]: E0224 00:36:02.696220 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vcbgn_openshift-machine-config-operator(939ca085-9383-42e6-b7d6-37f101137273)\"" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" Feb 24 00:36:17 crc kubenswrapper[4824]: I0224 00:36:17.694137 4824 scope.go:117] "RemoveContainer" containerID="1fadecb8108bb82124c951985733cccd456bd51e1965c7fbca274984d7901364" Feb 24 00:36:17 crc kubenswrapper[4824]: E0224 00:36:17.695398 4824 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vcbgn_openshift-machine-config-operator(939ca085-9383-42e6-b7d6-37f101137273)\"" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" Feb 24 00:36:28 crc kubenswrapper[4824]: I0224 00:36:28.694488 4824 scope.go:117] "RemoveContainer" containerID="1fadecb8108bb82124c951985733cccd456bd51e1965c7fbca274984d7901364" Feb 24 00:36:28 crc kubenswrapper[4824]: E0224 00:36:28.695736 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vcbgn_openshift-machine-config-operator(939ca085-9383-42e6-b7d6-37f101137273)\"" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" Feb 24 00:36:41 crc kubenswrapper[4824]: I0224 00:36:41.694051 4824 scope.go:117] "RemoveContainer" containerID="1fadecb8108bb82124c951985733cccd456bd51e1965c7fbca274984d7901364" Feb 24 00:36:41 crc kubenswrapper[4824]: E0224 00:36:41.695112 4824 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vcbgn_openshift-machine-config-operator(939ca085-9383-42e6-b7d6-37f101137273)\"" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" Feb 24 00:36:46 crc kubenswrapper[4824]: I0224 00:36:46.106163 4824 generic.go:334] "Generic (PLEG): container finished" podID="30551e99-c0dc-480e-8aa4-cfb7df233fa5" containerID="be48c11ddee87f6376cb573bf02565d2ce45f4f7484d835a87096a80c130fad9" exitCode=0 Feb 24 
00:36:46 crc kubenswrapper[4824]: I0224 00:36:46.106460 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v4984/must-gather-t5kpz" event={"ID":"30551e99-c0dc-480e-8aa4-cfb7df233fa5","Type":"ContainerDied","Data":"be48c11ddee87f6376cb573bf02565d2ce45f4f7484d835a87096a80c130fad9"} Feb 24 00:36:46 crc kubenswrapper[4824]: I0224 00:36:46.107446 4824 scope.go:117] "RemoveContainer" containerID="be48c11ddee87f6376cb573bf02565d2ce45f4f7484d835a87096a80c130fad9" Feb 24 00:36:46 crc kubenswrapper[4824]: I0224 00:36:46.829775 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-v4984_must-gather-t5kpz_30551e99-c0dc-480e-8aa4-cfb7df233fa5/gather/0.log" Feb 24 00:36:53 crc kubenswrapper[4824]: I0224 00:36:53.694022 4824 scope.go:117] "RemoveContainer" containerID="1fadecb8108bb82124c951985733cccd456bd51e1965c7fbca274984d7901364" Feb 24 00:36:53 crc kubenswrapper[4824]: I0224 00:36:53.883021 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-v4984/must-gather-t5kpz"] Feb 24 00:36:53 crc kubenswrapper[4824]: I0224 00:36:53.883800 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-v4984/must-gather-t5kpz" podUID="30551e99-c0dc-480e-8aa4-cfb7df233fa5" containerName="copy" containerID="cri-o://c82f94371919f040f11f9a25c3e88f21683ea931dd570e2a2fd2964f6a6b29a8" gracePeriod=2 Feb 24 00:36:53 crc kubenswrapper[4824]: I0224 00:36:53.890474 4824 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-v4984/must-gather-t5kpz"] Feb 24 00:36:54 crc kubenswrapper[4824]: I0224 00:36:54.186132 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" event={"ID":"939ca085-9383-42e6-b7d6-37f101137273","Type":"ContainerStarted","Data":"2d6f8ce5501722862dc8ed78387b85d7725f9ecfe5b1eca0592c5b8a2bb70509"} Feb 24 00:36:54 crc kubenswrapper[4824]: I0224 00:36:54.188631 4824 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-v4984_must-gather-t5kpz_30551e99-c0dc-480e-8aa4-cfb7df233fa5/copy/0.log" Feb 24 00:36:54 crc kubenswrapper[4824]: I0224 00:36:54.189065 4824 generic.go:334] "Generic (PLEG): container finished" podID="30551e99-c0dc-480e-8aa4-cfb7df233fa5" containerID="c82f94371919f040f11f9a25c3e88f21683ea931dd570e2a2fd2964f6a6b29a8" exitCode=143 Feb 24 00:36:54 crc kubenswrapper[4824]: I0224 00:36:54.288107 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-v4984_must-gather-t5kpz_30551e99-c0dc-480e-8aa4-cfb7df233fa5/copy/0.log" Feb 24 00:36:54 crc kubenswrapper[4824]: I0224 00:36:54.288613 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-v4984/must-gather-t5kpz" Feb 24 00:36:54 crc kubenswrapper[4824]: I0224 00:36:54.394365 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w677s\" (UniqueName: \"kubernetes.io/projected/30551e99-c0dc-480e-8aa4-cfb7df233fa5-kube-api-access-w677s\") pod \"30551e99-c0dc-480e-8aa4-cfb7df233fa5\" (UID: \"30551e99-c0dc-480e-8aa4-cfb7df233fa5\") " Feb 24 00:36:54 crc kubenswrapper[4824]: I0224 00:36:54.394474 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/30551e99-c0dc-480e-8aa4-cfb7df233fa5-must-gather-output\") pod \"30551e99-c0dc-480e-8aa4-cfb7df233fa5\" (UID: \"30551e99-c0dc-480e-8aa4-cfb7df233fa5\") " Feb 24 00:36:54 crc kubenswrapper[4824]: I0224 00:36:54.406850 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30551e99-c0dc-480e-8aa4-cfb7df233fa5-kube-api-access-w677s" (OuterVolumeSpecName: "kube-api-access-w677s") pod "30551e99-c0dc-480e-8aa4-cfb7df233fa5" (UID: "30551e99-c0dc-480e-8aa4-cfb7df233fa5"). InnerVolumeSpecName "kube-api-access-w677s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:36:54 crc kubenswrapper[4824]: I0224 00:36:54.451813 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30551e99-c0dc-480e-8aa4-cfb7df233fa5-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "30551e99-c0dc-480e-8aa4-cfb7df233fa5" (UID: "30551e99-c0dc-480e-8aa4-cfb7df233fa5"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:36:54 crc kubenswrapper[4824]: I0224 00:36:54.495807 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w677s\" (UniqueName: \"kubernetes.io/projected/30551e99-c0dc-480e-8aa4-cfb7df233fa5-kube-api-access-w677s\") on node \"crc\" DevicePath \"\"" Feb 24 00:36:54 crc kubenswrapper[4824]: I0224 00:36:54.496061 4824 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/30551e99-c0dc-480e-8aa4-cfb7df233fa5-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 24 00:36:54 crc kubenswrapper[4824]: I0224 00:36:54.704205 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30551e99-c0dc-480e-8aa4-cfb7df233fa5" path="/var/lib/kubelet/pods/30551e99-c0dc-480e-8aa4-cfb7df233fa5/volumes" Feb 24 00:36:55 crc kubenswrapper[4824]: I0224 00:36:55.197808 4824 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-v4984_must-gather-t5kpz_30551e99-c0dc-480e-8aa4-cfb7df233fa5/copy/0.log" Feb 24 00:36:55 crc kubenswrapper[4824]: I0224 00:36:55.198443 4824 scope.go:117] "RemoveContainer" containerID="c82f94371919f040f11f9a25c3e88f21683ea931dd570e2a2fd2964f6a6b29a8" Feb 24 00:36:55 crc kubenswrapper[4824]: I0224 00:36:55.198564 4824 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-v4984/must-gather-t5kpz" Feb 24 00:36:55 crc kubenswrapper[4824]: I0224 00:36:55.223725 4824 scope.go:117] "RemoveContainer" containerID="be48c11ddee87f6376cb573bf02565d2ce45f4f7484d835a87096a80c130fad9" Feb 24 00:37:13 crc kubenswrapper[4824]: I0224 00:37:13.177898 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dzb2d"] Feb 24 00:37:13 crc kubenswrapper[4824]: E0224 00:37:13.178874 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30551e99-c0dc-480e-8aa4-cfb7df233fa5" containerName="copy" Feb 24 00:37:13 crc kubenswrapper[4824]: I0224 00:37:13.178890 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="30551e99-c0dc-480e-8aa4-cfb7df233fa5" containerName="copy" Feb 24 00:37:13 crc kubenswrapper[4824]: E0224 00:37:13.178901 4824 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30551e99-c0dc-480e-8aa4-cfb7df233fa5" containerName="gather" Feb 24 00:37:13 crc kubenswrapper[4824]: I0224 00:37:13.178907 4824 state_mem.go:107] "Deleted CPUSet assignment" podUID="30551e99-c0dc-480e-8aa4-cfb7df233fa5" containerName="gather" Feb 24 00:37:13 crc kubenswrapper[4824]: I0224 00:37:13.179025 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="30551e99-c0dc-480e-8aa4-cfb7df233fa5" containerName="copy" Feb 24 00:37:13 crc kubenswrapper[4824]: I0224 00:37:13.179038 4824 memory_manager.go:354] "RemoveStaleState removing state" podUID="30551e99-c0dc-480e-8aa4-cfb7df233fa5" containerName="gather" Feb 24 00:37:13 crc kubenswrapper[4824]: I0224 00:37:13.179969 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dzb2d" Feb 24 00:37:13 crc kubenswrapper[4824]: I0224 00:37:13.197910 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dzb2d"] Feb 24 00:37:13 crc kubenswrapper[4824]: I0224 00:37:13.341555 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c00f22c6-c1c6-448e-82da-b778d04a8c0f-utilities\") pod \"certified-operators-dzb2d\" (UID: \"c00f22c6-c1c6-448e-82da-b778d04a8c0f\") " pod="openshift-marketplace/certified-operators-dzb2d" Feb 24 00:37:13 crc kubenswrapper[4824]: I0224 00:37:13.341651 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c00f22c6-c1c6-448e-82da-b778d04a8c0f-catalog-content\") pod \"certified-operators-dzb2d\" (UID: \"c00f22c6-c1c6-448e-82da-b778d04a8c0f\") " pod="openshift-marketplace/certified-operators-dzb2d" Feb 24 00:37:13 crc kubenswrapper[4824]: I0224 00:37:13.341679 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dz22r\" (UniqueName: \"kubernetes.io/projected/c00f22c6-c1c6-448e-82da-b778d04a8c0f-kube-api-access-dz22r\") pod \"certified-operators-dzb2d\" (UID: \"c00f22c6-c1c6-448e-82da-b778d04a8c0f\") " pod="openshift-marketplace/certified-operators-dzb2d" Feb 24 00:37:13 crc kubenswrapper[4824]: I0224 00:37:13.443553 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c00f22c6-c1c6-448e-82da-b778d04a8c0f-utilities\") pod \"certified-operators-dzb2d\" (UID: \"c00f22c6-c1c6-448e-82da-b778d04a8c0f\") " pod="openshift-marketplace/certified-operators-dzb2d" Feb 24 00:37:13 crc kubenswrapper[4824]: I0224 00:37:13.444086 4824 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c00f22c6-c1c6-448e-82da-b778d04a8c0f-catalog-content\") pod \"certified-operators-dzb2d\" (UID: \"c00f22c6-c1c6-448e-82da-b778d04a8c0f\") " pod="openshift-marketplace/certified-operators-dzb2d" Feb 24 00:37:13 crc kubenswrapper[4824]: I0224 00:37:13.444193 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dz22r\" (UniqueName: \"kubernetes.io/projected/c00f22c6-c1c6-448e-82da-b778d04a8c0f-kube-api-access-dz22r\") pod \"certified-operators-dzb2d\" (UID: \"c00f22c6-c1c6-448e-82da-b778d04a8c0f\") " pod="openshift-marketplace/certified-operators-dzb2d" Feb 24 00:37:13 crc kubenswrapper[4824]: I0224 00:37:13.444341 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c00f22c6-c1c6-448e-82da-b778d04a8c0f-utilities\") pod \"certified-operators-dzb2d\" (UID: \"c00f22c6-c1c6-448e-82da-b778d04a8c0f\") " pod="openshift-marketplace/certified-operators-dzb2d" Feb 24 00:37:13 crc kubenswrapper[4824]: I0224 00:37:13.444713 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c00f22c6-c1c6-448e-82da-b778d04a8c0f-catalog-content\") pod \"certified-operators-dzb2d\" (UID: \"c00f22c6-c1c6-448e-82da-b778d04a8c0f\") " pod="openshift-marketplace/certified-operators-dzb2d" Feb 24 00:37:13 crc kubenswrapper[4824]: I0224 00:37:13.468139 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dz22r\" (UniqueName: \"kubernetes.io/projected/c00f22c6-c1c6-448e-82da-b778d04a8c0f-kube-api-access-dz22r\") pod \"certified-operators-dzb2d\" (UID: \"c00f22c6-c1c6-448e-82da-b778d04a8c0f\") " pod="openshift-marketplace/certified-operators-dzb2d" Feb 24 00:37:13 crc kubenswrapper[4824]: I0224 00:37:13.510947 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dzb2d" Feb 24 00:37:13 crc kubenswrapper[4824]: I0224 00:37:13.793998 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dzb2d"] Feb 24 00:37:14 crc kubenswrapper[4824]: I0224 00:37:14.363580 4824 generic.go:334] "Generic (PLEG): container finished" podID="c00f22c6-c1c6-448e-82da-b778d04a8c0f" containerID="5e0ce036e3c72b781ea3b560ad5c6fd9d6bea62d8dd93a6cf8eb9b569edc50db" exitCode=0 Feb 24 00:37:14 crc kubenswrapper[4824]: I0224 00:37:14.363699 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dzb2d" event={"ID":"c00f22c6-c1c6-448e-82da-b778d04a8c0f","Type":"ContainerDied","Data":"5e0ce036e3c72b781ea3b560ad5c6fd9d6bea62d8dd93a6cf8eb9b569edc50db"} Feb 24 00:37:14 crc kubenswrapper[4824]: I0224 00:37:14.364158 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dzb2d" event={"ID":"c00f22c6-c1c6-448e-82da-b778d04a8c0f","Type":"ContainerStarted","Data":"8e1dc3a83619ae5d02e4c6a899175d5a9f386c93b9b7c498c85725fdec53ea8b"} Feb 24 00:37:14 crc kubenswrapper[4824]: I0224 00:37:14.366719 4824 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 24 00:37:15 crc kubenswrapper[4824]: I0224 00:37:15.375153 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dzb2d" event={"ID":"c00f22c6-c1c6-448e-82da-b778d04a8c0f","Type":"ContainerStarted","Data":"974c7c06ffea10da3a5f8cd31e0d72834e82e9bfc1f63954f9c6d3d2b8c8db86"} Feb 24 00:37:15 crc kubenswrapper[4824]: I0224 00:37:15.380827 4824 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-plnc5"] Feb 24 00:37:15 crc kubenswrapper[4824]: I0224 00:37:15.382737 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-plnc5" Feb 24 00:37:15 crc kubenswrapper[4824]: I0224 00:37:15.400387 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-plnc5"] Feb 24 00:37:15 crc kubenswrapper[4824]: I0224 00:37:15.485055 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88nvc\" (UniqueName: \"kubernetes.io/projected/9d1ec7bf-1088-4425-8d7e-36112806bc0b-kube-api-access-88nvc\") pod \"redhat-operators-plnc5\" (UID: \"9d1ec7bf-1088-4425-8d7e-36112806bc0b\") " pod="openshift-marketplace/redhat-operators-plnc5" Feb 24 00:37:15 crc kubenswrapper[4824]: I0224 00:37:15.485184 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d1ec7bf-1088-4425-8d7e-36112806bc0b-catalog-content\") pod \"redhat-operators-plnc5\" (UID: \"9d1ec7bf-1088-4425-8d7e-36112806bc0b\") " pod="openshift-marketplace/redhat-operators-plnc5" Feb 24 00:37:15 crc kubenswrapper[4824]: I0224 00:37:15.485239 4824 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d1ec7bf-1088-4425-8d7e-36112806bc0b-utilities\") pod \"redhat-operators-plnc5\" (UID: \"9d1ec7bf-1088-4425-8d7e-36112806bc0b\") " pod="openshift-marketplace/redhat-operators-plnc5" Feb 24 00:37:15 crc kubenswrapper[4824]: I0224 00:37:15.586400 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88nvc\" (UniqueName: \"kubernetes.io/projected/9d1ec7bf-1088-4425-8d7e-36112806bc0b-kube-api-access-88nvc\") pod \"redhat-operators-plnc5\" (UID: \"9d1ec7bf-1088-4425-8d7e-36112806bc0b\") " pod="openshift-marketplace/redhat-operators-plnc5" Feb 24 00:37:15 crc kubenswrapper[4824]: I0224 00:37:15.586484 4824 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d1ec7bf-1088-4425-8d7e-36112806bc0b-catalog-content\") pod \"redhat-operators-plnc5\" (UID: \"9d1ec7bf-1088-4425-8d7e-36112806bc0b\") " pod="openshift-marketplace/redhat-operators-plnc5" Feb 24 00:37:15 crc kubenswrapper[4824]: I0224 00:37:15.586547 4824 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d1ec7bf-1088-4425-8d7e-36112806bc0b-utilities\") pod \"redhat-operators-plnc5\" (UID: \"9d1ec7bf-1088-4425-8d7e-36112806bc0b\") " pod="openshift-marketplace/redhat-operators-plnc5" Feb 24 00:37:15 crc kubenswrapper[4824]: I0224 00:37:15.587120 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d1ec7bf-1088-4425-8d7e-36112806bc0b-utilities\") pod \"redhat-operators-plnc5\" (UID: \"9d1ec7bf-1088-4425-8d7e-36112806bc0b\") " pod="openshift-marketplace/redhat-operators-plnc5" Feb 24 00:37:15 crc kubenswrapper[4824]: I0224 00:37:15.587370 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d1ec7bf-1088-4425-8d7e-36112806bc0b-catalog-content\") pod \"redhat-operators-plnc5\" (UID: \"9d1ec7bf-1088-4425-8d7e-36112806bc0b\") " pod="openshift-marketplace/redhat-operators-plnc5" Feb 24 00:37:15 crc kubenswrapper[4824]: I0224 00:37:15.611287 4824 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88nvc\" (UniqueName: \"kubernetes.io/projected/9d1ec7bf-1088-4425-8d7e-36112806bc0b-kube-api-access-88nvc\") pod \"redhat-operators-plnc5\" (UID: \"9d1ec7bf-1088-4425-8d7e-36112806bc0b\") " pod="openshift-marketplace/redhat-operators-plnc5" Feb 24 00:37:15 crc kubenswrapper[4824]: I0224 00:37:15.699603 4824 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-plnc5" Feb 24 00:37:15 crc kubenswrapper[4824]: I0224 00:37:15.949417 4824 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-plnc5"] Feb 24 00:37:16 crc kubenswrapper[4824]: I0224 00:37:16.386030 4824 generic.go:334] "Generic (PLEG): container finished" podID="c00f22c6-c1c6-448e-82da-b778d04a8c0f" containerID="974c7c06ffea10da3a5f8cd31e0d72834e82e9bfc1f63954f9c6d3d2b8c8db86" exitCode=0 Feb 24 00:37:16 crc kubenswrapper[4824]: I0224 00:37:16.386114 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dzb2d" event={"ID":"c00f22c6-c1c6-448e-82da-b778d04a8c0f","Type":"ContainerDied","Data":"974c7c06ffea10da3a5f8cd31e0d72834e82e9bfc1f63954f9c6d3d2b8c8db86"} Feb 24 00:37:16 crc kubenswrapper[4824]: I0224 00:37:16.388688 4824 generic.go:334] "Generic (PLEG): container finished" podID="9d1ec7bf-1088-4425-8d7e-36112806bc0b" containerID="d932575d23aee6270236e4e02e8eac17b663f548742031e262b07026c08e0e00" exitCode=0 Feb 24 00:37:16 crc kubenswrapper[4824]: I0224 00:37:16.388719 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-plnc5" event={"ID":"9d1ec7bf-1088-4425-8d7e-36112806bc0b","Type":"ContainerDied","Data":"d932575d23aee6270236e4e02e8eac17b663f548742031e262b07026c08e0e00"} Feb 24 00:37:16 crc kubenswrapper[4824]: I0224 00:37:16.388739 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-plnc5" event={"ID":"9d1ec7bf-1088-4425-8d7e-36112806bc0b","Type":"ContainerStarted","Data":"012fe057c0077558a9201df0455b125e8df328cdf606f15f72a5d15a80b93c69"} Feb 24 00:37:17 crc kubenswrapper[4824]: I0224 00:37:17.398136 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-plnc5" 
event={"ID":"9d1ec7bf-1088-4425-8d7e-36112806bc0b","Type":"ContainerStarted","Data":"f95f8b60bfb86d3d4d9d5f73769d2f2841e4de66351544100b23acaa7e1f0e20"} Feb 24 00:37:17 crc kubenswrapper[4824]: I0224 00:37:17.400573 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dzb2d" event={"ID":"c00f22c6-c1c6-448e-82da-b778d04a8c0f","Type":"ContainerStarted","Data":"788a443efb38ec6ffead6b4c2bd4f5be9be2cbe38e8523898e0a39aa4e318f55"} Feb 24 00:37:17 crc kubenswrapper[4824]: I0224 00:37:17.452510 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dzb2d" podStartSLOduration=1.9647066359999998 podStartE2EDuration="4.452486678s" podCreationTimestamp="2026-02-24 00:37:13 +0000 UTC" firstStartedPulling="2026-02-24 00:37:14.366297694 +0000 UTC m=+1898.355922173" lastFinishedPulling="2026-02-24 00:37:16.854077746 +0000 UTC m=+1900.843702215" observedRunningTime="2026-02-24 00:37:17.451054402 +0000 UTC m=+1901.440678891" watchObservedRunningTime="2026-02-24 00:37:17.452486678 +0000 UTC m=+1901.442111147" Feb 24 00:37:18 crc kubenswrapper[4824]: I0224 00:37:18.414355 4824 generic.go:334] "Generic (PLEG): container finished" podID="9d1ec7bf-1088-4425-8d7e-36112806bc0b" containerID="f95f8b60bfb86d3d4d9d5f73769d2f2841e4de66351544100b23acaa7e1f0e20" exitCode=0 Feb 24 00:37:18 crc kubenswrapper[4824]: I0224 00:37:18.414650 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-plnc5" event={"ID":"9d1ec7bf-1088-4425-8d7e-36112806bc0b","Type":"ContainerDied","Data":"f95f8b60bfb86d3d4d9d5f73769d2f2841e4de66351544100b23acaa7e1f0e20"} Feb 24 00:37:19 crc kubenswrapper[4824]: I0224 00:37:19.424511 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-plnc5" 
event={"ID":"9d1ec7bf-1088-4425-8d7e-36112806bc0b","Type":"ContainerStarted","Data":"25c0739d92e36a44edd0b5aaf9c2a2dda811274cb63c527e398c4da941023b85"} Feb 24 00:37:19 crc kubenswrapper[4824]: I0224 00:37:19.452475 4824 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-plnc5" podStartSLOduration=1.994357559 podStartE2EDuration="4.452447239s" podCreationTimestamp="2026-02-24 00:37:15 +0000 UTC" firstStartedPulling="2026-02-24 00:37:16.390113839 +0000 UTC m=+1900.379738308" lastFinishedPulling="2026-02-24 00:37:18.848203519 +0000 UTC m=+1902.837827988" observedRunningTime="2026-02-24 00:37:19.447289568 +0000 UTC m=+1903.436914057" watchObservedRunningTime="2026-02-24 00:37:19.452447239 +0000 UTC m=+1903.442071708" Feb 24 00:37:23 crc kubenswrapper[4824]: I0224 00:37:23.511887 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dzb2d" Feb 24 00:37:23 crc kubenswrapper[4824]: I0224 00:37:23.513497 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dzb2d" Feb 24 00:37:23 crc kubenswrapper[4824]: I0224 00:37:23.560500 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dzb2d" Feb 24 00:37:24 crc kubenswrapper[4824]: I0224 00:37:24.513915 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dzb2d" Feb 24 00:37:24 crc kubenswrapper[4824]: I0224 00:37:24.566902 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dzb2d"] Feb 24 00:37:25 crc kubenswrapper[4824]: I0224 00:37:25.700094 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-plnc5" Feb 24 00:37:25 crc kubenswrapper[4824]: I0224 00:37:25.700580 4824 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-plnc5" Feb 24 00:37:25 crc kubenswrapper[4824]: I0224 00:37:25.803753 4824 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-plnc5" Feb 24 00:37:26 crc kubenswrapper[4824]: I0224 00:37:26.483489 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dzb2d" podUID="c00f22c6-c1c6-448e-82da-b778d04a8c0f" containerName="registry-server" containerID="cri-o://788a443efb38ec6ffead6b4c2bd4f5be9be2cbe38e8523898e0a39aa4e318f55" gracePeriod=2 Feb 24 00:37:26 crc kubenswrapper[4824]: I0224 00:37:26.531702 4824 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-plnc5" Feb 24 00:37:26 crc kubenswrapper[4824]: I0224 00:37:26.968134 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-plnc5"] Feb 24 00:37:27 crc kubenswrapper[4824]: I0224 00:37:27.955021 4824 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dzb2d" Feb 24 00:37:28 crc kubenswrapper[4824]: I0224 00:37:28.111747 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dz22r\" (UniqueName: \"kubernetes.io/projected/c00f22c6-c1c6-448e-82da-b778d04a8c0f-kube-api-access-dz22r\") pod \"c00f22c6-c1c6-448e-82da-b778d04a8c0f\" (UID: \"c00f22c6-c1c6-448e-82da-b778d04a8c0f\") " Feb 24 00:37:28 crc kubenswrapper[4824]: I0224 00:37:28.112400 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c00f22c6-c1c6-448e-82da-b778d04a8c0f-utilities\") pod \"c00f22c6-c1c6-448e-82da-b778d04a8c0f\" (UID: \"c00f22c6-c1c6-448e-82da-b778d04a8c0f\") " Feb 24 00:37:28 crc kubenswrapper[4824]: I0224 00:37:28.113321 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c00f22c6-c1c6-448e-82da-b778d04a8c0f-utilities" (OuterVolumeSpecName: "utilities") pod "c00f22c6-c1c6-448e-82da-b778d04a8c0f" (UID: "c00f22c6-c1c6-448e-82da-b778d04a8c0f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:37:28 crc kubenswrapper[4824]: I0224 00:37:28.113475 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c00f22c6-c1c6-448e-82da-b778d04a8c0f-catalog-content\") pod \"c00f22c6-c1c6-448e-82da-b778d04a8c0f\" (UID: \"c00f22c6-c1c6-448e-82da-b778d04a8c0f\") " Feb 24 00:37:28 crc kubenswrapper[4824]: I0224 00:37:28.124609 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c00f22c6-c1c6-448e-82da-b778d04a8c0f-kube-api-access-dz22r" (OuterVolumeSpecName: "kube-api-access-dz22r") pod "c00f22c6-c1c6-448e-82da-b778d04a8c0f" (UID: "c00f22c6-c1c6-448e-82da-b778d04a8c0f"). InnerVolumeSpecName "kube-api-access-dz22r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:37:28 crc kubenswrapper[4824]: I0224 00:37:28.133429 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dz22r\" (UniqueName: \"kubernetes.io/projected/c00f22c6-c1c6-448e-82da-b778d04a8c0f-kube-api-access-dz22r\") on node \"crc\" DevicePath \"\"" Feb 24 00:37:28 crc kubenswrapper[4824]: I0224 00:37:28.133478 4824 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c00f22c6-c1c6-448e-82da-b778d04a8c0f-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 00:37:28 crc kubenswrapper[4824]: I0224 00:37:28.187303 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c00f22c6-c1c6-448e-82da-b778d04a8c0f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c00f22c6-c1c6-448e-82da-b778d04a8c0f" (UID: "c00f22c6-c1c6-448e-82da-b778d04a8c0f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:37:28 crc kubenswrapper[4824]: I0224 00:37:28.234602 4824 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c00f22c6-c1c6-448e-82da-b778d04a8c0f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 00:37:28 crc kubenswrapper[4824]: I0224 00:37:28.501879 4824 generic.go:334] "Generic (PLEG): container finished" podID="c00f22c6-c1c6-448e-82da-b778d04a8c0f" containerID="788a443efb38ec6ffead6b4c2bd4f5be9be2cbe38e8523898e0a39aa4e318f55" exitCode=0 Feb 24 00:37:28 crc kubenswrapper[4824]: I0224 00:37:28.501965 4824 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dzb2d" Feb 24 00:37:28 crc kubenswrapper[4824]: I0224 00:37:28.501979 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dzb2d" event={"ID":"c00f22c6-c1c6-448e-82da-b778d04a8c0f","Type":"ContainerDied","Data":"788a443efb38ec6ffead6b4c2bd4f5be9be2cbe38e8523898e0a39aa4e318f55"} Feb 24 00:37:28 crc kubenswrapper[4824]: I0224 00:37:28.502021 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dzb2d" event={"ID":"c00f22c6-c1c6-448e-82da-b778d04a8c0f","Type":"ContainerDied","Data":"8e1dc3a83619ae5d02e4c6a899175d5a9f386c93b9b7c498c85725fdec53ea8b"} Feb 24 00:37:28 crc kubenswrapper[4824]: I0224 00:37:28.502040 4824 scope.go:117] "RemoveContainer" containerID="788a443efb38ec6ffead6b4c2bd4f5be9be2cbe38e8523898e0a39aa4e318f55" Feb 24 00:37:28 crc kubenswrapper[4824]: I0224 00:37:28.502378 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-plnc5" podUID="9d1ec7bf-1088-4425-8d7e-36112806bc0b" containerName="registry-server" containerID="cri-o://25c0739d92e36a44edd0b5aaf9c2a2dda811274cb63c527e398c4da941023b85" gracePeriod=2 Feb 24 00:37:28 crc kubenswrapper[4824]: I0224 00:37:28.557824 4824 scope.go:117] "RemoveContainer" containerID="974c7c06ffea10da3a5f8cd31e0d72834e82e9bfc1f63954f9c6d3d2b8c8db86" Feb 24 00:37:28 crc kubenswrapper[4824]: I0224 00:37:28.568587 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dzb2d"] Feb 24 00:37:28 crc kubenswrapper[4824]: I0224 00:37:28.579392 4824 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dzb2d"] Feb 24 00:37:28 crc kubenswrapper[4824]: I0224 00:37:28.590897 4824 scope.go:117] "RemoveContainer" containerID="5e0ce036e3c72b781ea3b560ad5c6fd9d6bea62d8dd93a6cf8eb9b569edc50db" Feb 24 00:37:28 crc 
kubenswrapper[4824]: I0224 00:37:28.623911 4824 scope.go:117] "RemoveContainer" containerID="788a443efb38ec6ffead6b4c2bd4f5be9be2cbe38e8523898e0a39aa4e318f55" Feb 24 00:37:28 crc kubenswrapper[4824]: E0224 00:37:28.624464 4824 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"788a443efb38ec6ffead6b4c2bd4f5be9be2cbe38e8523898e0a39aa4e318f55\": container with ID starting with 788a443efb38ec6ffead6b4c2bd4f5be9be2cbe38e8523898e0a39aa4e318f55 not found: ID does not exist" containerID="788a443efb38ec6ffead6b4c2bd4f5be9be2cbe38e8523898e0a39aa4e318f55" Feb 24 00:37:28 crc kubenswrapper[4824]: I0224 00:37:28.624585 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"788a443efb38ec6ffead6b4c2bd4f5be9be2cbe38e8523898e0a39aa4e318f55"} err="failed to get container status \"788a443efb38ec6ffead6b4c2bd4f5be9be2cbe38e8523898e0a39aa4e318f55\": rpc error: code = NotFound desc = could not find container \"788a443efb38ec6ffead6b4c2bd4f5be9be2cbe38e8523898e0a39aa4e318f55\": container with ID starting with 788a443efb38ec6ffead6b4c2bd4f5be9be2cbe38e8523898e0a39aa4e318f55 not found: ID does not exist" Feb 24 00:37:28 crc kubenswrapper[4824]: I0224 00:37:28.624624 4824 scope.go:117] "RemoveContainer" containerID="974c7c06ffea10da3a5f8cd31e0d72834e82e9bfc1f63954f9c6d3d2b8c8db86" Feb 24 00:37:28 crc kubenswrapper[4824]: E0224 00:37:28.625184 4824 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"974c7c06ffea10da3a5f8cd31e0d72834e82e9bfc1f63954f9c6d3d2b8c8db86\": container with ID starting with 974c7c06ffea10da3a5f8cd31e0d72834e82e9bfc1f63954f9c6d3d2b8c8db86 not found: ID does not exist" containerID="974c7c06ffea10da3a5f8cd31e0d72834e82e9bfc1f63954f9c6d3d2b8c8db86" Feb 24 00:37:28 crc kubenswrapper[4824]: I0224 00:37:28.625212 4824 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"974c7c06ffea10da3a5f8cd31e0d72834e82e9bfc1f63954f9c6d3d2b8c8db86"} err="failed to get container status \"974c7c06ffea10da3a5f8cd31e0d72834e82e9bfc1f63954f9c6d3d2b8c8db86\": rpc error: code = NotFound desc = could not find container \"974c7c06ffea10da3a5f8cd31e0d72834e82e9bfc1f63954f9c6d3d2b8c8db86\": container with ID starting with 974c7c06ffea10da3a5f8cd31e0d72834e82e9bfc1f63954f9c6d3d2b8c8db86 not found: ID does not exist" Feb 24 00:37:28 crc kubenswrapper[4824]: I0224 00:37:28.625230 4824 scope.go:117] "RemoveContainer" containerID="5e0ce036e3c72b781ea3b560ad5c6fd9d6bea62d8dd93a6cf8eb9b569edc50db" Feb 24 00:37:28 crc kubenswrapper[4824]: E0224 00:37:28.628535 4824 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e0ce036e3c72b781ea3b560ad5c6fd9d6bea62d8dd93a6cf8eb9b569edc50db\": container with ID starting with 5e0ce036e3c72b781ea3b560ad5c6fd9d6bea62d8dd93a6cf8eb9b569edc50db not found: ID does not exist" containerID="5e0ce036e3c72b781ea3b560ad5c6fd9d6bea62d8dd93a6cf8eb9b569edc50db" Feb 24 00:37:28 crc kubenswrapper[4824]: I0224 00:37:28.628570 4824 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e0ce036e3c72b781ea3b560ad5c6fd9d6bea62d8dd93a6cf8eb9b569edc50db"} err="failed to get container status \"5e0ce036e3c72b781ea3b560ad5c6fd9d6bea62d8dd93a6cf8eb9b569edc50db\": rpc error: code = NotFound desc = could not find container \"5e0ce036e3c72b781ea3b560ad5c6fd9d6bea62d8dd93a6cf8eb9b569edc50db\": container with ID starting with 5e0ce036e3c72b781ea3b560ad5c6fd9d6bea62d8dd93a6cf8eb9b569edc50db not found: ID does not exist" Feb 24 00:37:28 crc kubenswrapper[4824]: I0224 00:37:28.703588 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c00f22c6-c1c6-448e-82da-b778d04a8c0f" path="/var/lib/kubelet/pods/c00f22c6-c1c6-448e-82da-b778d04a8c0f/volumes" Feb 24 00:37:30 crc kubenswrapper[4824]: I0224 
00:37:30.527419 4824 generic.go:334] "Generic (PLEG): container finished" podID="9d1ec7bf-1088-4425-8d7e-36112806bc0b" containerID="25c0739d92e36a44edd0b5aaf9c2a2dda811274cb63c527e398c4da941023b85" exitCode=0 Feb 24 00:37:30 crc kubenswrapper[4824]: I0224 00:37:30.527578 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-plnc5" event={"ID":"9d1ec7bf-1088-4425-8d7e-36112806bc0b","Type":"ContainerDied","Data":"25c0739d92e36a44edd0b5aaf9c2a2dda811274cb63c527e398c4da941023b85"} Feb 24 00:37:31 crc kubenswrapper[4824]: I0224 00:37:31.325244 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-plnc5" Feb 24 00:37:31 crc kubenswrapper[4824]: I0224 00:37:31.398393 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d1ec7bf-1088-4425-8d7e-36112806bc0b-catalog-content\") pod \"9d1ec7bf-1088-4425-8d7e-36112806bc0b\" (UID: \"9d1ec7bf-1088-4425-8d7e-36112806bc0b\") " Feb 24 00:37:31 crc kubenswrapper[4824]: I0224 00:37:31.398474 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88nvc\" (UniqueName: \"kubernetes.io/projected/9d1ec7bf-1088-4425-8d7e-36112806bc0b-kube-api-access-88nvc\") pod \"9d1ec7bf-1088-4425-8d7e-36112806bc0b\" (UID: \"9d1ec7bf-1088-4425-8d7e-36112806bc0b\") " Feb 24 00:37:31 crc kubenswrapper[4824]: I0224 00:37:31.398574 4824 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d1ec7bf-1088-4425-8d7e-36112806bc0b-utilities\") pod \"9d1ec7bf-1088-4425-8d7e-36112806bc0b\" (UID: \"9d1ec7bf-1088-4425-8d7e-36112806bc0b\") " Feb 24 00:37:31 crc kubenswrapper[4824]: I0224 00:37:31.399915 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/9d1ec7bf-1088-4425-8d7e-36112806bc0b-utilities" (OuterVolumeSpecName: "utilities") pod "9d1ec7bf-1088-4425-8d7e-36112806bc0b" (UID: "9d1ec7bf-1088-4425-8d7e-36112806bc0b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:37:31 crc kubenswrapper[4824]: I0224 00:37:31.408124 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d1ec7bf-1088-4425-8d7e-36112806bc0b-kube-api-access-88nvc" (OuterVolumeSpecName: "kube-api-access-88nvc") pod "9d1ec7bf-1088-4425-8d7e-36112806bc0b" (UID: "9d1ec7bf-1088-4425-8d7e-36112806bc0b"). InnerVolumeSpecName "kube-api-access-88nvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 00:37:31 crc kubenswrapper[4824]: I0224 00:37:31.501125 4824 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88nvc\" (UniqueName: \"kubernetes.io/projected/9d1ec7bf-1088-4425-8d7e-36112806bc0b-kube-api-access-88nvc\") on node \"crc\" DevicePath \"\"" Feb 24 00:37:31 crc kubenswrapper[4824]: I0224 00:37:31.501173 4824 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d1ec7bf-1088-4425-8d7e-36112806bc0b-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 00:37:31 crc kubenswrapper[4824]: I0224 00:37:31.518171 4824 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d1ec7bf-1088-4425-8d7e-36112806bc0b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9d1ec7bf-1088-4425-8d7e-36112806bc0b" (UID: "9d1ec7bf-1088-4425-8d7e-36112806bc0b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 00:37:31 crc kubenswrapper[4824]: I0224 00:37:31.543604 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-plnc5" event={"ID":"9d1ec7bf-1088-4425-8d7e-36112806bc0b","Type":"ContainerDied","Data":"012fe057c0077558a9201df0455b125e8df328cdf606f15f72a5d15a80b93c69"} Feb 24 00:37:31 crc kubenswrapper[4824]: I0224 00:37:31.543692 4824 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-plnc5" Feb 24 00:37:31 crc kubenswrapper[4824]: I0224 00:37:31.543702 4824 scope.go:117] "RemoveContainer" containerID="25c0739d92e36a44edd0b5aaf9c2a2dda811274cb63c527e398c4da941023b85" Feb 24 00:37:31 crc kubenswrapper[4824]: I0224 00:37:31.577864 4824 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-plnc5"] Feb 24 00:37:31 crc kubenswrapper[4824]: I0224 00:37:31.584510 4824 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-plnc5"] Feb 24 00:37:31 crc kubenswrapper[4824]: I0224 00:37:31.588258 4824 scope.go:117] "RemoveContainer" containerID="f95f8b60bfb86d3d4d9d5f73769d2f2841e4de66351544100b23acaa7e1f0e20" Feb 24 00:37:31 crc kubenswrapper[4824]: I0224 00:37:31.602940 4824 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d1ec7bf-1088-4425-8d7e-36112806bc0b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 00:37:31 crc kubenswrapper[4824]: I0224 00:37:31.615610 4824 scope.go:117] "RemoveContainer" containerID="d932575d23aee6270236e4e02e8eac17b663f548742031e262b07026c08e0e00" Feb 24 00:37:32 crc kubenswrapper[4824]: I0224 00:37:32.705148 4824 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d1ec7bf-1088-4425-8d7e-36112806bc0b" path="/var/lib/kubelet/pods/9d1ec7bf-1088-4425-8d7e-36112806bc0b/volumes" Feb 24 00:38:53 crc 
kubenswrapper[4824]: I0224 00:38:53.275956 4824 patch_prober.go:28] interesting pod/machine-config-daemon-vcbgn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 00:38:53 crc kubenswrapper[4824]: I0224 00:38:53.276633 4824 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 00:39:23 crc kubenswrapper[4824]: I0224 00:39:23.276646 4824 patch_prober.go:28] interesting pod/machine-config-daemon-vcbgn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 00:39:23 crc kubenswrapper[4824]: I0224 00:39:23.277461 4824 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 00:39:53 crc kubenswrapper[4824]: I0224 00:39:53.276763 4824 patch_prober.go:28] interesting pod/machine-config-daemon-vcbgn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 00:39:53 crc kubenswrapper[4824]: I0224 00:39:53.277344 4824 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 00:39:53 crc kubenswrapper[4824]: I0224 00:39:53.277404 4824 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" Feb 24 00:39:53 crc kubenswrapper[4824]: I0224 00:39:53.278486 4824 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2d6f8ce5501722862dc8ed78387b85d7725f9ecfe5b1eca0592c5b8a2bb70509"} pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 24 00:39:53 crc kubenswrapper[4824]: I0224 00:39:53.278655 4824 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" podUID="939ca085-9383-42e6-b7d6-37f101137273" containerName="machine-config-daemon" containerID="cri-o://2d6f8ce5501722862dc8ed78387b85d7725f9ecfe5b1eca0592c5b8a2bb70509" gracePeriod=600 Feb 24 00:39:53 crc kubenswrapper[4824]: I0224 00:39:53.851207 4824 generic.go:334] "Generic (PLEG): container finished" podID="939ca085-9383-42e6-b7d6-37f101137273" containerID="2d6f8ce5501722862dc8ed78387b85d7725f9ecfe5b1eca0592c5b8a2bb70509" exitCode=0 Feb 24 00:39:53 crc kubenswrapper[4824]: I0224 00:39:53.851289 4824 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" event={"ID":"939ca085-9383-42e6-b7d6-37f101137273","Type":"ContainerDied","Data":"2d6f8ce5501722862dc8ed78387b85d7725f9ecfe5b1eca0592c5b8a2bb70509"} Feb 24 00:39:53 crc kubenswrapper[4824]: I0224 00:39:53.851754 4824 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vcbgn" event={"ID":"939ca085-9383-42e6-b7d6-37f101137273","Type":"ContainerStarted","Data":"757d7a71d70b24c3eab67d48983e4684df68406250689d9f983c164d729b2fb5"} Feb 24 00:39:53 crc kubenswrapper[4824]: I0224 00:39:53.851784 4824 scope.go:117] "RemoveContainer" containerID="1fadecb8108bb82124c951985733cccd456bd51e1965c7fbca274984d7901364"